Happy is the man that findeth wisdom, and the man that getteth understanding.
-Proverbs 3:13
Saturday, October 31, 2015
Online Communication
I don't know that I'd totally agree with the statement that the authors believe technology is always the answer. If used correctly and implemented well, it often can be. But it's not always. A large piece of the SDLC, which the course covers, deals with the need to analyze what the problem is before choosing how to fix it. And the course starts with several chapters on understanding how businesses function before it gets into a substantive discussion of technology.
Even your example of the discussion we might have F2F, while true in just the right set of circumstances, doesn't necessarily work, for a couple of reasons. One: how likely is it that you could get a group of people who are enrolled in the course together in one place? Not likely, due to geography and differing schedules, which is why most of you are taking online courses to begin with.
And two: how often do you actually get a substantial conversation in a group? Does everyone actually get to participate? In many F2F classes, 90% of the students sit there and don't participate at all. A small handful will often dominate the conversation, because not everyone can talk at the same time, and the introverts like actually thinking about what they're going to say before they say it; by the time they decide what to say, the conversation has moved on.
Technology levels the playing field a bit. Not everyone who uses Wikipedia contributes to it by writing or fixing articles, but enough do that it has become an invaluable resource, comprehensive and accurate enough to put print encyclopedias out of business. I will fully agree that technology is often used ineffectively or inefficiently, sometimes just for the sake of using technology rather than for a real business need, and sometimes as the wrong tool (a hammer when a miter saw is called for).
That, I hope, is the point of the course. If you don't have the right tool or can't speak the right language to get what you need from the IT department, you will have problems. Flip it around: it's not that technology is the hammer and talking to each other is the rest of the toolbox, but rather a question of which technology tools can help us in which situations. Thanks for the conversation starter. I'm glad that your asynchronous post allowed me to answer later, since I was busy at the time you made it. How about anyone else? Examples of using technology effectively or not effectively at work or other places you spend your time?
The student's eyes were opened a bit, I think, recognizing how common it is for some people to dominate the conversation in a live group. We didn't really talk about this specifically, but as I think back on it, I wonder how often it leads to bad, extreme ideas, simply because the extroverts who like to blurt things out without thinking end up directing most of the conversation. This seems particularly relevant as we are in the middle of election season. The student's follow-up comment, that the younger generation is overly involved in technology and doesn't know how to communicate face to face, is sadly true. It isn't a reason to get rid of technology but rather a reason to focus on when and how to communicate effectively in a variety of situations.
I definitely agree with you there regarding upcoming generations who only know how to communicate through technology, even to people who are in the same room as them. Or who communicate electronically only to people who are far away and ignore the people close by them. Just look at a group of teenagers in practically any environment, and you'll see very little live interaction among them, which is sad to see. I went to Europe a couple years ago, and I was happy to turn off my cell phone for two weeks and just enjoy what was there in front of me rather than trying to stay up on the latest FB gossip. I enjoy the same when heading up to the mountains for some hiking or backpacking. Some literally go through physical symptoms of withdrawal if removed from an always-connected environment. That doesn't mean the technology is bad, just that the person hasn't learned to use other options or is bad at selecting which tool to use when.
Tuesday, December 16, 2014
Information Systems Success Models - An Annotated Bibliography
DeLone, W.H., & McLean, E.R. (1992). Information systems success: The quest for the dependent variable. Information Systems Research, 3(1).
IS success is multidimensional and interdependent, so interactions between success dimensions need to be isolated. Success dimensions should be based on the goals of the research as well as proven measures where available, and the number of factors should be minimized. The key factors included in the model are system quality, information quality, system use, user satisfaction, individual impact, and organizational impact.
Rai, A., Lang, S.S., & Welker, R.B. (2002). Assessing the validity of IS success models: An empirical test and theoretical analysis. Information Systems Research, 13(1).
Several IS success models are compared. One major factor that differs among the models is the category of IS Use. Some models include Use as a process, since it is a prerequisite to other factors; others treat it as an indicator of success, since people won't use a system they haven't decided will be useful to them; and there is the further distinction between perceived usefulness and measured use. The Technology Acceptance Model suggests that perceived usefulness and ease of use directly impact user behavior and system use.
Seddon, P.B. (1997). A respecification and extension of DeLone and McLean’s model of IS success. Information Systems Research, 8(September).
Standard variance models assert that variance in independent variables predicts variance in dependent variables. Process models, on the other hand, posit not only that the occurrence of certain events is necessary but that a particular sequence of events leads to a change in the dependent variable. The presented IS success model removes the process component of DeLone and McLean's model. The problem with the original model is that it mixed three meanings of information system use. One meaning is that use provides some benefit to the user. A second, invalid, meaning presents use as a dependent variable of expected future use (i.e., if users believe the system will be useful in the future, they will use it now). The third, also invalid, treats use as an event in the process that leads to individual or organizational impact. The proposed model links measures of system and information quality to perceived usefulness and user satisfaction, which in turn lead to expectations of future system usefulness and then to use. Observed benefits to other individuals, organizations, and society also impact perceived usefulness and user satisfaction, regardless of system or information quality.
Velasquez, N.F., Sabherwal, R., & Durcikova, A. (2011). Adoption of an electronic knowledge repository: A feature-based approach. Presented at 44th Hawaii International Conference on System Sciences, 4-7 January 2011, Kauai, HI.
This article discusses types of use among knowledge base users. A cluster analysis yielded three types of users: Enthusiastic Knowledge Seekers, Thoughtful Knowledge Providers, and Reluctant Non-adopters. Enthusiastic Knowledge Seekers made up the largest group at 70%. They had less knowledge and experience and shared little if anything of their own, but considered the knowledge base articles to be of high quality and very useful. The Thoughtful Knowledge Providers, 19% of the users, submitted quality articles to the knowledge base, enjoyed sharing their knowledge with others, had moderate experience, and were intrinsically motivated. The smallest group, Reluctant Non-adopters at 11%, were experts who were highly experienced and adept at knowledge sharing but lacked the time or intrinsic motivation to contribute meaningfully. They considered the knowledge base to be low quality and did not consider it worth their time to work on improving it.
Friday, October 31, 2014
Prioritization
The SWOT Analysis is an example of a Quadrant II activity that helps you understand where you should be focusing your time in order to be the most effective. A SWOT Analysis doesn't need to be overly structured or complicated. Spending a lot of time building a pretty SWOT template and training everyone on its use would be a good example of an unimportant activity. Put it out there and let it happen, whether you use a 2x2 matrix, bulleted lists, or more of a free-form mind map. The strengths and weaknesses are inward facing. They refer to inherent qualities of the company or department and what they're currently doing. Opportunities and threats are outward facing. They are qualities of the environment, actions of competitors, or imminent events that will have an effect on you. The goal is to build on strengths and take advantage of opportunities, while eliminating weaknesses and preventing threats from knocking you down.
In order to get a handle on where to focus attention, after brainstorming, it's important to group and rank the items you have listed. Provide additional details to determine the size of the threat or the amount of money an opportunity may be worth to you. Often there are connections between the internal and external analysis. You can leverage your strengths to take advantage of opportunities and avoid threats. Overcoming a weakness may open up new opportunities. So draw those connections and quantify each aspect of your analysis, but keep your analysis simple and visual. Keeping it all on one page will allow you and others to see how all the parts tie together. Provide additional information as a separate write-up and attach it on following pages. Of course, as you begin making decisions on what to focus on, you will come up with a more detailed plan, which is great, but the initial analysis should remain simple and understandable by anyone who picks it up.
People usually like showing off their good side, so it is easy to list the strengths, however realistic they actually are. It's more difficult for managers to get honest answers from their employees about real weaknesses and threats, so it is important to create a safe place when brainstorming the more negative aspects. Here's where having done your relationship building with your team will allow there to be enough trust to do this legitimately. You might have to use a technology solution that lets team members submit weaknesses and threats anonymously if you don't have the trust to do so face to face. Being aware of and honest about your challenges can show as much strength as listing out your strengths, or more.
Monday, August 4, 2014
Simplification
http://www.cio.com/article/2451671/it-strategy/simplifying-it-pays-off-with-big-savings-better-business-success.html
Friday, December 28, 2012
Web Design Workshop
One-week summer workshops are a great option: an intense, short-term class that teaches you some real skills you can begin putting into practice immediately.
They've changed a little, as they're now taught throughout the summer, whereas they used to be taught during a workshop week between two of the four-week terms. I guess the new schedule would be nice if you wanted to go through more than one workshop.
Looking at their current offerings, they have workshops related to using Photoshop, creating a resume and portfolio, sign language, understanding the history of China, and appreciating fantasy fiction. Okay, maybe all of those might not be immediately useful.
The summer workshop I went through was about using Dreamweaver and Fireworks. This was over a dozen years ago, well before Adobe bought out Macromedia, and the web still had a bit of a wild west feel to it. Sites were simple, designs were kitschy, and digital cameras were low-res and high-priced. I actually borrowed a digital camera from my department to do a couple websites after the workshop, and it totally reminded me of Luke Skywalker's binoculars. It looked like his binoculars, and the quality probably wasn't much better. Of course at that time, you couldn't print digital photos anyway, and monitor resolutions on those fat old CRTs weren't good enough to be able to tell that the quality was low. About the best you could do on old CRTs to make them half decent was to crank up the refresh rate so they didn't flicker. You wanted something higher than 60 Hz so the flicker wasn't visible, but if you went too high, you could damage the monitor, so you had to decide how much you wanted to gamble. It was always fun degaussing old CRTs as well.

Here's a photo I actually took at the time. Note the little border I added around the edge for no good reason. Then there's all the dirty noise that almost gives it an instagrammy feel. I think most people taking photos on all but the nicest cell phones these days could just upload as is and say they used an instagram filter. That's probably why people like instagram, because it makes their phone photos look like they're supposed to be old and dirty. Someone recently sent out a photo from a major work event, of over 100 posed people. It looked awful - faces all blurry and washed out. I looked at the picture's metadata, and sure enough, it was taken on an iPhone 4. Why someone would waste the time of that many people to get them all posed and then just take their photo with a camera phone, I don't think I'll ever understand. We have nice cameras now - use them.

The workshop was a fun one. It was one week, several hours in the morning and afternoon, every day. We even got brownies each day during an afternoon break. We each made a personal site and showed it off to everyone. They were all terrible, I'm sure. We learned the basics of using tables, lists, font formatting options, frames (I know), linking, creating buttons that changed when you moused over or clicked on them, etc. I figured out some basic JavaScript that randomly picked a different picture to show on the home page each time it loaded. Several of us used AnimationFactory, which is surprisingly still around, to find 3D-ish animated gifs to put on our sites. I remember after showing off our sites, we sat around watching random videos. In particular, I remember showing Weird Al's music video The Saga Begins (American Pie). Still a great video.
The most important part of the workshop is that I now had a few days' worth of exposure to Dreamweaver and thus could put it on my resume. The resume with that skill listed got me a job on campus at Career Services. They had various IT-related tasks that needed to be done around the office, but the biggest one was launching a new website. Don't get me wrong - I'd been making Notepad websites for several years at that point, but they wanted the site done in Dreamweaver, and I was the man to do it. That job was a really fun one and a great career starter. I used it as an internship, so I'll write more about it when I discuss the internship class.
Tuesday, February 7, 2012
IPv6: The Day the Routers Died
If you're not familiar with the problem, think about what happens when a state or city runs out of phone numbers and has to add a new area code. But what happens when we run out of area codes? Or what will happen when we run out of Social Security Numbers?
We have been talking about it for a while.
There's this video from 4 or 5 years ago (long but funny).
And this video from just a couple years ago (long, not so funny, but informative).
It goes back further than that, but suffice it to say it's not taking anybody by surprise. It's now been a year since all the IPv4 address blocks ran out (No more IPv4 addresses, Internet Runs Out Of IP Addresses), although it will be a while before individual addresses are all completely allocated. There are plenty of techniques to run multiple devices behind one IP address, and there may be some ways to recover some previously unused or unusable addresses. These workarounds can cause as many problems as making the IPv6 jump might, though, so it makes sense to get moving. As Randy Bush explains in the second video, those people and companies who get it figured out now will be leaps and bounds ahead of those who wait until crunch time.
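If you've never done the math on the jump from 32-bit to 128-bit addresses, it's worth a minute. Here's a quick back-of-the-envelope sketch in Python (the per-person figure assumes roughly 7 billion people, just for illustration):

```python
# Rough comparison of the IPv4 and IPv6 address spaces.
ipv4_total = 2 ** 32   # 32-bit addresses: 4,294,967,296 in all
ipv6_total = 2 ** 128  # 128-bit addresses: about 3.4 x 10^38

print(f"IPv4 addresses:            {ipv4_total:,}")
print(f"IPv6 addresses:            {ipv6_total:.2e}")
print(f"IPv6 addresses per person: {ipv6_total / 7e9:.2e}")  # ~7 billion people
```

That's not an area code's worth of breathing room; it's a different universe of numbers entirely.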
Friday, January 6, 2012
Projects in Visual Basic
Given Visual Basic's quick and dirty nature, it's no surprise that it's the language taught to business students who need to understand a bit of programming but don't need to be able to create their own compiler or operating system. One thing I liked about this course was that, of all the programming courses I've taken, it was one of the few actually taught by faculty in my department. I worked in the Computer Science department for 5 years and don't have anything against CS faculty at all, but there's a greater connection when someone from your own department teaches a course.
As a case in point, look at this particular course. When I took it, it was the last semester that just one undergraduate version of the course was taught. The whole semester, the professor who taught it would bag on accounting students and other non-MIS business majors who didn't know anything about computers or programming. They were in the same section as the students with more technical majors because there wasn't another option. All we heard about was how the professor wanted to move on to all these advanced concepts but couldn't because the accounting students were holding us back. This, of course, was the same thing I heard as an MIS student in CS classes: that we weren't real programmers like the CS students. So CS bags on MIS; MIS bags on accounting; maybe accounting bags on human resources or marketing? At least marketers have a handle on social media, so it's probably HR that's the bottom of the technology food chain.
However the pecking order goes, the next semester would see two versions of the course, for technical and non-technical business majors. I don't know how much they lightened the load in the non-technical version, since VB is already a junior version of programming.
As an example of a quick VB program, several years later I wrote a small tool to automate common tasks in the testing center I used to run. I created a map of the lab with a button for each computer and for common actions. You would select the computer and then click the button for the action you wanted to perform: view the screen, reboot it, shut it down, turn it on, or mark that it was being used to take a certain test. It even had an option to cancel the shutdown command if you accidentally selected the wrong computer and realized it within 5 seconds. Students get freaked out for some reason when their computer shuts down on them in the middle of a test. The funnest options were to start up or shut down all the computers at the same time. It was like a race to see which computer would boot up first, and the silence when all the fans came to a stop was so peaceful.
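The original was quick-and-dirty VB that I no longer have, but the cancel-window idea is easy to sketch in any language. Here's a rough Python approximation (the class, the computer name, and the print statements are placeholders, not the actual program, which issued real remote commands):

```python
import threading

class LabComputer:
    """Placeholder for one machine on the lab map."""
    def __init__(self, name):
        self.name = name
        self._pending = None  # timer for a not-yet-executed shutdown

    def _do_shutdown(self):
        # The real program issued a remote shutdown command here.
        print(f"{self.name}: shutting down")
        self._pending = None

    def shutdown(self, grace=5.0):
        # Start the shutdown, but give the operator a cancel window.
        self._pending = threading.Timer(grace, self._do_shutdown)
        self._pending.start()
        print(f"{self.name}: shutdown in {grace:.0f}s (cancel available)")

    def cancel(self):
        # Called when the wrong computer was selected by mistake.
        if self._pending:
            self._pending.cancel()
            self._pending = None
            print(f"{self.name}: shutdown cancelled")

pc = LabComputer("Lab-12")
pc.shutdown()
pc.cancel()  # caught it within the grace period
```

The whole trick is a timer that doesn't fire until the grace period is up.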
The program was pretty simple and very inelegantly written, but it saved a ton of time for those working the lab. An employee of mine sought to improve on my design and write a fancy version 2.0 with all kinds of customization options to work in other computer labs, but quick and dirty won the race. In all the time he worked on it, he never got version 2.0 working.
Friday, December 23, 2011
Database Management
We started out learning the basics of databases. What's a table, row, query, DBMS, join, etc.? Unlike the class I had taken where I should have learned some of these introductory concepts, it was actually clear this time around how they would be used.
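For anyone who hasn't seen those terms in action, here's a tiny self-contained example using Python's built-in sqlite3 module (the tables and data are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")  # throwaway in-memory database
cur = con.cursor()

# Two tables: each row is one record, each column one attribute.
cur.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE grades (student_id INTEGER, course TEXT, grade TEXT)")
cur.executemany("INSERT INTO students VALUES (?, ?)",
                [(1, "Alice"), (2, "Bob")])
cur.executemany("INSERT INTO grades VALUES (?, ?, ?)",
                [(1, "Databases", "A"), (2, "Databases", "B+")])

# A query with a join: match grade rows to student rows by id.
for name, course, grade in cur.execute(
        """SELECT s.name, g.course, g.grade
           FROM students s JOIN grades g ON g.student_id = s.id"""):
    print(name, course, grade)
```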

We had plenty of time in class to practice our new SQL skills on our own and in groups, learning from mistakes and celebrating successes. Ultimately the course built up to a final project. It had to involve a database of some kind, obviously. Other than that, it was up to us to determine what we would implement based on the other technical skills we had. I don't remember what my project was, other than that I worked like crazy on it, and it ended up close but never functioned quite right. Thinking back on it, I doubt the professor spent much time actually using our final projects and checking in depth how well they worked, as long as our documentation was in order and it looked like we had done something big enough, but it was motivating to be able to choose our own project and figure out how to make it work.
Wednesday, September 28, 2011
Principles of Business Information Systems
The part that was weird was that this was a 300-level course. It's something I never understood. Apparently other people didn't understand it either, because they have since changed it to a 200-level course.
I just never really got it and still don't. If they changed it to a 1-credit, 100-level orientation course, I'd totally understand. What are some career options? What do IS professionals contribute to the business environment? Basic terminology related to hardware and software. Trends in the field. Professional organizations. I remember writing a two-page paper about my brother-in-law's business, selling calculators and cables, since back in the 90s, this crazy new e-commerce thing was a pretty big deal.
They actually added some Excel and HTML/CSS to the course. Yes, when they dropped it down from a 300-level to a 200-level course they added more stuff to it. That may have been to keep them from having to drop it to a 100-level course.
Tuesday, September 27, 2011
Data Communication and Networking
Something I still wonder about to this day is the deal with the McDonald's cup my instructor would be sipping on every class. It was one of those jumbo paper cups, not a mug, and they don't last that long, so I'm sure it was a new one every time. It made me wonder if he just ate out on days we had class or if he ate at McDonald's every day. Or maybe he did save that same cup and reuse it all the time.
On the first day of class, we took a pretest. We were promised that anyone who received higher than a certain score (maybe 70) would receive an automatic A in the class and wouldn't have to attend the rest of the semester. The test was basically a preview of the final exam, and it was hard. Nobody came anywhere close. It does make me wonder what he did with the test results. I don't know if he used them to determine, in aggregate, where we were knowledgeable and where we were lacking, to guide his lectures. I've always thought it was a great idea but have never seen anyone else start off a class this way.

Wednesday, September 21, 2011
Spreadsheets and Databases
We had a seating chart, where we had to sit in the same seats every time. We got to pick where but once we picked, it was set, so that the professor could use his map of names and where people sat. I thought it was nice that he was using some type of system to learn our names. I was lucky to have a professor who was actually savvy in the way of databases. Others I know had someone who didn't understand databases at all teaching the course, so instead of spending 50% of the time each on Excel and Access they would spend 12 weeks on Excel and maybe 3 weeks on Access.
I did learn a lot about Excel. I don't know that I ever really "understood" databases from this class, but I learned a lot of the basics. I could perform the various required tasks but didn't have a solid grasp of why you would actually do what we were doing. A roommate of mine sat next to me. I remember him asking one day for the professor to explain how we might use this in real life. It was just too abstract for him, as it was for me and probably most everyone else in the class. Nobody else had the guts to actually ask why we were learning this stuff. I don't remember his answer other than it wasn't a good one.
This makes me think of one of the key principles of andragogy, or adult education: adults do better when they understand why they are learning something. I contend that children and young adults, for whom a pedagogical or teacher-centered approach is traditionally used, would also do better if they understood the why. The difference is in the power relationship we have when teaching younger students. They still want to know, and sometimes they are even willing to ask how what they are learning will be useful to them. Because of the imbalance of power, we blow them off instead of taking them seriously. Adults simply hold their ground and require you to come up with a good answer, while young students who hold their ground are disciplined.
Monday, December 21, 2009
Be Weird
Recognition of different races is something that should have been pretty obvious to test. Recognition in poor lighting situations is another obvious one. HP likely did test both of these items.
It is just as important for QA teams to test things that aren't quite so obvious, such as the combination of race and poor lighting. What if you throw glasses, a hat, and headphones into the mix? What about glittery makeup? Facial hair? Vibration from using a laptop with the camera while riding as a passenger in a car or bus?
How far do you take it? Where are the reasonable limits?
For a good QA team, that's a trick question. There are no limits.
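To see why "no limits" matters, consider how fast the test matrix grows when you combine even a few of the dimensions mentioned above (a quick sketch; the factor values are invented for illustration):

```python
from itertools import product

# A few of the dimensions a webcam-tracking test might vary.
factors = {
    "skin tone": ["darker", "lighter"],
    "lighting":  ["bright", "dim", "backlit"],
    "glasses":   ["none", "clear", "tinted"],
    "hat":       ["no", "yes"],
    "motion":    ["still", "vibrating"],
}

combos = list(product(*factors.values()))
print(f"{len(combos)} combinations from just {len(factors)} factors")
# 2 * 3 * 3 * 2 * 2 = 72 test cases, before adding makeup, facial hair, ...
```

Every new factor multiplies the total, which is exactly why good testers chase the weird combinations instead of trying to enumerate them all.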

What did they do about the bug I found? Nothing. They determined it was not likely enough to actually happen to warrant setting up the unit to filter where it received video from. It didn't matter to me. My job wasn't to determine what the programmers were to work on. It was to do weird stuff and report the results. Now that I manage programmers and QA testers, it is my job to prioritize what gets worked on and to remind the QA testers to stay weird.
Wednesday, September 30, 2009
Free Software
The importance of openness in security was brought up, specifically how big software companies will generally try to hide vulnerabilities to protect the illusion of security, as opposed to the way open source projects just acknowledge and fix the security holes.
The conversation somehow turned into Abe accusing someone else of participating in an underground economy and personally benefiting from using free software at the expense of taxpayers, who are paying his salary (which is so false it doesn't even merit a reply). He continued on pointing out that free software isn't actually free, since there are all kinds of costs associated with it.
Of course there are costs associated with any software. The "free" doesn't mean that there are no acquisition costs but that once you have acquired it, you are free to do with it what you want. Proprietary software generally costs up front to purchase it, and then you are at the mercy of the software developers to make changes to the software if that is desired or needed. If you need a new feature and they don't want to implement it, you'll never get it. With free software, you may or may not pay up front to purchase it, but you are of course likely to invest in training, hardware, and other costs to actually implement it. The nice thing is that once you've implemented it, if you need a new feature, you can just add it or pay someone else to add it. If the original developer won't do it for you, it doesn't matter. You're free to change it if you want as long as you're willing to share your changes with others.
It was pretty obvious to everyone else that Abe didn't know what he was talking about, since he kept referring to money instead of freedom, so someone finally called him a troll. It didn't end there as he made a joke about trolls that showed he didn't know what a troll was. Someone else referred him to Wikipedia's article on trolls, after which Abe backed off and claimed he was just acting as devil's advocate and pointed out that the debate could just go back and forth all day so wasn't worth continuing.
I'm pretty sure he didn't understand all the arguments against his position or else he was the dumbest devil's advocate ever. Either way, he realized he was outmanned. The biggest piece that he was missing was not whether there are costs associated with implementing free software but that there are very real costs associated with not being permitted to maintain proprietary software yourself after implementing it. Can you really afford the lack of control over whatever platform you deploy if you use something other than free software?
Tuesday, February 17, 2009
Friday, December 19, 2008
SteadyState
In looking around for something to help keep computers under control, I found Windows SteadyState, a free program put out by Microsoft for locking down Microsoft Windows. As much as I dislike some of Microsoft's business practices and their frequent security problems (stop laughing, Mac fans - the Mac OS has been bitten by malware as much as Windows has lately), this is a program that appears to have what it takes to really lock down a computer. It won't help in cleaning one up after the fact, but it will keep it from getting messed up in the first place.
I used it to create an account for the kids. The account is locked down so no programs can run except Internet Explorer. Then IE is locked down so it is more limited than normal. You can set it up with a whitelist so only specific sites can be visited, but I didn't turn that feature on.
If something strange does get installed even with the limited version of IE that is running, when you log out of the account, all changes made to the computer are automatically removed. Pretty cool. You can unlock the account so you can make changes and then just lock it back up.
I'm still playing with it, so I don't have a full review for it yet, but I recommend trying it out.
If you don't want to totally lock down your computer to just a small list of websites but still want good protection, I recommend K9 Web Protection from Blue Coat Systems. It's free for personal use. It lets you pick from a huge list of categories of sites that you can block and logs all sites that are visited.
And if you do happen to get the Antivirus 2009 trojan installed on your computer, I've found System Restore, which is automatically enabled in both Windows XP and Vista, to be the easiest option to remove it.
Monday, March 31, 2008
But you told me you were reliable
Of course, to start off, you have to define some things, like what reliability means to you and how much you really need it. Are you running an ecommerce site that gets millions of hits and sales per day, with a minute or two of downtime or sluggish page loading leading to thousands in lost revenue? Are you running a private family wiki to plan your family reunion next year that probably no one will notice if it's down a day or two? How much is reliability really worth to you? You can find plans ranging anywhere from $4 to $400 per month for hosting services, all with different levels of guaranteed service and different amounts of storage space and file transfer or bandwidth.
Going back to what reliability means, we might ask what are the causes of unreliable data or data loss? Well, there's user error, such as deleting or saving over the top of a file. Then there is hardware failure where a drive actually crashes and everything on it is lost or the network is disrupted in some way and your data is temporarily unavailable. Chances are most hosts will have something in place to keep their hardware running, with some form of RAID and maybe even clustering. RAID is a redundant set of hard drives where one can die and the others keep running. Clustering is a similar concept, but instead of just redundant disks, you have redundant servers. Throwing a couple more disks in a machine is a lot cheaper than setting up a completely separate server, so be prepared to pay a lot more for clustering.
WestHost claims to have 99.9% uptime, which is about 40 minutes of downtime per month. Site5 is another host with a 99.9% uptime guarantee, where you get a prorated credit if there are unscheduled outages over 45 minutes in a month. How do you verify that? Well, you can look to someone like Netcraft, who monitors the web and reports all sorts of things, like what operating systems web servers are running and how long they have been up since their last reboot. They also have a page that lists hosting providers and shows how quickly they respond and how much downtime they have experienced. That chart is nice, if your host is one that has paid to be included on it.
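The arithmetic behind those guarantees is easy enough to check yourself. Here's a quick sketch (assuming a 30-day month):

```python
# Minutes of allowable downtime per 30-day month at a given uptime level.
minutes_per_month = 30 * 24 * 60  # 43,200

for uptime in (0.99, 0.999, 0.9999):
    allowed = minutes_per_month * (1 - uptime)
    print(f"{uptime:.2%} uptime -> {allowed:,.1f} minutes down per month")
```

So 99.9% works out to just over 43 minutes a month, and each extra nine cuts the allowance by a factor of ten.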
In order to recover from PEBKAC errors (problem exists between keyboard and chair), look for a host that provides the ability to create cron jobs, which are scheduled tasks you can use to back up your files, so in case you delete or save over something, you can recover it. You'll probably have to set these up yourself. If you have enough disk space on your host's server, you can backup to that same server, but you'll want to backup to another location as well. Don't schedule so many offsite backups that you use up your monthly bandwidth quota, though.
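As a concrete illustration, the backup script itself can be dead simple. Here's a minimal sketch in Python (the paths and filename are placeholders; a nightly crontab entry along the lines of `0 2 * * * python backup.py` would schedule it):

```python
import shutil
import time

# Placeholder paths -- point these at your site files and a safe location.
SOURCE = "/home/you/public_html"
DEST = "/home/you/backups"

# Timestamped archive so each run keeps a separate copy.
stamp = time.strftime("%Y%m%d-%H%M%S")
archive = shutil.make_archive(f"{DEST}/site-{stamp}", "gztar", SOURCE)
print(f"Backed up {SOURCE} to {archive}")
```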
Many companies will have a 30 or 60 day money back guarantee or a free trial period. Talk to people you know about how they like their host. Look at what kind of tech support availability they have. Although many companies will let you get by without a contract, most will give you a pretty decent discount for prepaying a year or two in advance, so you'll want to check to see how much of that is refundable if you want out after the guarantee period.
So, when choosing a host and a plan, it all depends on what you need and what you are willing to pay. No matter how much you pay and how foolproof your host's backup plan seems, never trust a single layer of backup. The key word is redundancy. How redundant? At least one layer deeper than you think you'll ever need.
Wednesday, March 26, 2008
Where'd my stuff go?
Kaushal Sheth recently posted about his web hosting woes, when his host died one day. He's moved on to a more stable one, Host Monster. WestHost is another that looks to be good, stable, and reasonably priced.
Wherever you end up, prices are low enough that you want to shop around and really make sure you get the service you need. Look for the availability of server side scripting languages like PHP and databases like MySQL or PostgreSQL. Many hosts have free applications you can use for hosting your blog, but as long as they have a database and programming language support, you can always upload your own applications. You don't have to use their wiki or photo sharing program. Just find an open source one and upload it yourself.
I personally use mostly the free services provided by companies rather than put together my own fully integrated site, but that is mostly because my research is headed in that direction - using existing Web 2.0 tools to collaborate and integrate content with other people, since that's what the majority of web users, who are not necessarily programmers, are going to be doing.
Whatever direction you end up taking, create a backup plan. After you've lost your entire life's work, it's a little late.
Thursday, March 20, 2008
Doctor G: You need a PPT transplant
I haven't used Google Docs a ton yet, but from what I remember, the word processor and spreadsheet functionality has been there longer than the presentation functionality. None of it is very advanced, but it covers the needs of probably two-thirds of the users out there. OpenOffice is a step up, and MS Office is another step up. OpenOffice is probably good enough for 95% of users, with only a small number actually needing MS Office. We won't go into the differences between Office 2007 and previous versions here.
The cool thing about Google Docs, however, is the collaborative features that are not present in other office suites. You can have multiple people editing the same document at the same time, and the changes that are made are logged. You can easily share a document with others to view or to edit, without ending up with multiple copies of your file sitting in email folders and saved in random places on hard drives and USB drives. There's no question who's got the latest version.
Another nice feature is that you can launch a presentation, and people can join in via the web and chat with you alongside the presentation pane. Add voice from Google Talk, and you've got a sweet product. Actually, I'm hoping they'll add video to their Google Talk application as well so I can get off Skype.
I recently sat in on a presentation by a company called Xapio that claims to have solved all of these problems with tracking versions of documents and who has read or otherwise accessed them. However, all their product consists of is a special mail server that strips out attachments, saves each attachment on their server, and sends a link in its place. They can then log who downloads the file using that link. That's nice, but you still end up having to download the file and work with it on your local machine. What if you have a bunch of people working on a document together? You'll have all the versions of the file saved on their server, but only the ones that have been emailed back and forth. They were doing some cool things with protecting confidential information from going out via email, and the lawyers will love that you can log whether or not someone has read a document you sent them, but it's not really collaborative.
Microsoft is so far missing the collaboration boat as well. They have Sharepoint, but it suffers from the high cost, confusing licensing terms, and of course the security and administration headache that comes from using MS products. Sharepoint has real-time presence information, wikis, blogs, calendars, document collaboration features, etc. Some large enterprises with large IT budgets are obviously going to be better off using Sharepoint, but small companies, nonprofits, students, families, etc. can have pretty much the same thing using free tools hosted on a server they don't have to manage.
Wednesday, February 27, 2008
I Am Supernode
I had been using the campus Jabber server to chat with the consultants in our testing lab but hadn't really been using anything to chat with the programmers. Since we all installed Skype, we've been chatting through that, so I actually talk to them more than I had been, even though it only takes about 7 seconds to walk out my door into their office.
So I got an email from a member of the campus security team saying that my computer was serving as a supernode for about 400,000 Skype clients. Basically, Skype uses a peer-to-peer network for handling traffic, and since I had a fast enough computer with a high-speed internet connection that was always on and logged into Skype, I got volunteered to route calls through my machine for other people. Yikes.
One site that talked about how to keep from getting wrestled into being a Skype supernode offered a much less technologically advanced solution than I had hoped for. Their suggestion? Block Skype with your firewall and only run the program when you are planning to make or receive a call. That's not ideal, but I suppose it will work, except that I now have to pick a different communication method for figuring out when we need to load up Skype so we can talk. Google Talk may work, since you can make voice calls through it; it doesn't do video, but none of us are really that exciting to look at anyway.
So, the lesson for today is be careful with Skype.
Tuesday, January 1, 2008
Investing in Security
Because of the complicated nature of many businesses, CxOs rightfully demand some monetary justification for pumping money into security rather than other projects that also need funding. According to the authors of this article, the traditional accounting measure, ROI, does not completely capture the benefits and risks associated with Information Security, so other measures must be used to find a project's Return on Security Investments (ROSI).
The first myth dispelled in the article is that "the accounting concept of 'return on investment' is an appropriate concept for evaluating information security investments." The problem with ROI is that it is an historical accounting measure. The IRR, an economic measure of future asset values and discounted cash flows, is a truer measure.
The second myth is that "Maximizing the IRR on information security investments is an appropriate objective." That is, the highest IRR is not necessarily the best choice. The amount of investment that yields the greatest net benefit is the one that should be chosen. The IRR measures a percentage return, not an actual return, so the IRR is useful up to a point, but to determine the actual amount to invest, the option with the largest actual return should be used. I'm not so sure that I agree with this point, but of course none of my articles have been published in Strategic Finance yet. What it looks like to me is that if another project has a higher marginal IRR, it will have a higher present value than the IS project, so the alternative project should receive the additional funding. This may be implied, but the article seems to focus on comparing a project's IRR only to itself, not to alternative projects.
The third myth is that "IRR and NPV are ex post metrics for evaluating the actual performance of information security investments." In reality, IRR is an ex ante measure, used to anticipate the returns of a project. Going back to measure the actual return on a project once it has been completed is called "post-auditing." Because of the nature of security investments, the more successful a project turns out to be, the fewer security problems will be seen, and the harder it will be to measure how successful it was. So the most successful project will result in no one even thinking about the security plan.
The fourth myth is that "it's appropriate to invest in security activities up to the level where the investments equal the expected loss from security breaches." The probability of an event taking place must be factored in when determining the present value of those investments. So, with this final point, the authors tell us that optimal level of security investments must be found, not just the project's rate of return.
In their research, they have found that on average, a maximum of about one-third of the expected potential loss should be invested in preventing that loss. By investing more than this, the amount spent preventing a loss approaches the amount of the loss. By investing less than this, a firm leaves its systems completely open.
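To put rough numbers on that (an illustrative calculation only; the dollar amount and probability are invented, and "about one-third" is the authors' empirical rule of thumb rather than an exact constant):

```python
# Hypothetical breach scenario.
potential_loss = 1_000_000   # cost if the breach actually occurs
probability    = 0.20        # chance of the breach in the period

expected_loss = potential_loss * probability   # myth 4: weight by probability
max_investment = expected_loss / 3             # authors' ~one-third ceiling

print(f"Expected loss:            ${expected_loss:,.0f}")
print(f"Suggested max investment: ${max_investment:,.0f}")
```

So a $1M potential loss with a 20% chance of occurring justifies at most something in the neighborhood of $67,000 in prevention, not $1M.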
Gordon, L.A. & Loeb, M.P. (2002). Return on information security investments: Myths vs realities. Strategic Finance, 84(5).