A project is a temporary endeavor. Its successful completion results in the creation of a new or improved product, service, process, or other result.

Being temporary means it should have a distinct beginning and end. In some project-based organizations, the temptation may be to drag the project on forever as a form of job security. The best job security, however, is being efficient at finishing projects and knowing your successful performance means you’ll always be reassigned once your current project is over.

Operations and processes just keep going on without a distinct beginning or end. An assembly line may be used to build a car from beginning to end, but as a whole, the assembly line is really a process that continually creates new cars over and over. If an inefficiency in the process is found, a project may be undertaken to overhaul the process, but once the new process is in place, it goes on with no planned end in sight.

Operations are important to the consistent functioning of a business. But don't underestimate the transformational power of a good project.

# Rob Barton

Happy is the man that findeth wisdom, and the man that getteth understanding.

-Proverbs 3:13

## Tuesday, March 24, 2015

## Saturday, February 28, 2015

### Cardinal Wolsey

When I am forgotten, as I shall be,

And sleep in dull cold marble, where no mention

Of me more must be heard of, say, I taught thee,

Say, Wolsey, that once trod the ways of glory,

And sounded all the depths and shoals of honour,

Found thee a way, out of his wreck, to rise in;

A sure and safe one, though thy master miss'd it.

Mark but my fall, and that that ruin'd me.

Cromwell, I charge thee, fling away ambition:

By that sin fell the angels; how can man, then,

The image of his Maker, hope to win by it?

Love thyself last: cherish those hearts that hate thee;

Corruption wins not more than honesty.

Still in thy right hand carry gentle peace,

To silence envious tongues. Be just, and fear not:

Take an inventory of all I have;

My robe, and my integrity to heaven, is all

I dare now call mine own. O Cromwell, Cromwell!

Had I but served my God with half the zeal

I served my king, he would not in mine age

Have left me naked to mine enemies.

*Image "Cardinal Wolsey Christ Church" by Sampson Strong (circa 1550–1611)*

## Tuesday, January 6, 2015

### Haiku

I jokingly told my daughter, who is supposed to do a presentation of some type on the seasons, that she should do it as a haiku. I was looking up the "rules," since I couldn't remember how many syllables were supposed to be in each line. I found a site that talked about haiku, with all the rules and a bunch of examples. There are some great ones on that page. I really like the Christmas one about three-quarters of the way down the page.

The basics are the 5 | 7 | 5 syllables per line, and it doesn't have to rhyme. What I had either forgotten or not known is that it is supposed to be seasonal, even if not obviously seasonal. And it's supposed to have a twist of some kind. So there are two halves, with some change from one to the other that provides a new perspective. Of course she had to do it, because of the season thing, but I still couldn't convince her, so she's doing a boring poster with a sun and the tilt of the earth across the different seasons.

So I decided to write a haiku for each season. Since they don't generally have titles (which would be kind of cheating on the 17-syllable thing), I grabbed some great Creative Commons licensed pics from Flickr to accompany each. Sure, each pic is worth 1000 words, but no syllables, so here they are with my four haiku:

Frigid, wind-whipped, dark,

Sullen stillness, empty streets.

Introvert's blanket.

Golden flowers bloom.

Wildlife fills the savannah.

Dandelions roar.

She reclines in sand,

Ocean waves in the distance.

Aye, mocking mirage.

Final drops, warmth drained,

He leans into coming cold.

## Tuesday, December 16, 2014

### Information Systems Success Models - An Annotated Bibliography

DeLone, W.H., & McLean, E.R. (1992). Information systems success: The quest for the dependent variable. Information Systems Research, 3(1).

IS success is multidimensional and interdependent, so interactions between success dimensions need to be isolated. Success dimensions should be based on the goals of the research as well as proven measures where available. The number of factors should be minimized. The key factors in the model are system quality, information quality, system use, user satisfaction, individual impact, and organizational impact.

Rai, A., Lang, S.S., & Welker, R.B. (2002). Assessing the validity of IS success models: An empirical test and theoretical analysis. Information Systems Research, 13(1).

IS success models are compared. One major factor that differs among models is the category of IS Use. Some models treat Use as a process, since it is a prerequisite to other factors; others treat it as an indicator of success, since people won't use a system unless they have determined it will be useful to them; and models also differ on perceived usefulness versus measured use. The Technology Acceptance Model suggests that perceived usefulness and ease of use directly impact user behavior and system use.

Seddon, P.B. (1997). A respecification and extension of DeLone and McLean’s model of IS success. Information Systems Research, 8(September).

Standard variance models assert that variance in independent variables predicts variance in dependent variables. Process models, on the other hand, posit that not only is the occurrence of events necessary but that it is a particular sequence of events that leads to a change in the dependent variable. The presented IS success model removes the process component of DeLone and McLean's model. The problematic model contained three meanings of information system use. One meaning is that use provides some benefit to the user. A second, invalid, meaning presented use as a dependent variable of future use (i.e., if the user believes the system will be useful in the future, they will use it now). The third, also invalid, is that use is an event in the process that leads to individual or organizational impact. The proposed model links measures of system and information quality to perceived usefulness and user satisfaction, which in turn lead to expectations of future system usefulness and then use. Observing benefits to other individuals, organizations, and society also impacts perceived usefulness and user satisfaction regardless of system or information quality.

Velasquez, N.F., Sabherwal, R., & Durcikova, A. (2011). Adoption of an electronic knowledge repository: A feature-based approach. Presented at 44th Hawaii International Conference on System Sciences, 4-7 January 2011, Kauai, HI.

This article discusses the types of use for knowledge base users. It utilized a cluster analysis to come up with three types of users: Enthusiastic Knowledge Seekers, Thoughtful Knowledge Providers, and Reluctant Non-adopters. Enthusiastic Knowledge Seekers made up the largest group at 70%. They had less knowledge and experience and shared little if anything of their own but considered the knowledgebase articles to be of high quality and very useful. The Thoughtful Knowledge Providers, 19% of the users, submitted quality articles to the knowledgebase, enjoyed sharing their knowledge with others, had moderate experience, and were intrinsically motivated. The smallest group, Reluctant Non-adopters at 11%, were experts who were highly experienced and adept at knowledge sharing but lacked the time or intrinsic motivation to contribute meaningfully. They considered the knowledgebase to be low quality and did not consider it worth their time to work on improving it.

## Thursday, December 4, 2014

### Cluster Analysis and Special Probability Distributions - An Annotated Bibliography

Antonenko, P., Toy, S., & Niederhauser, D. (2012). Using cluster analysis for data mining in educational technology research. Educational Technology Research and Development, 60(3), 383-398.

Server log data from online learning environments can be analyzed to examine student behaviors, in terms of pages visited, length of time on a page, order of links clicked, and so on. This analysis is less cognitively taxing to the student than think-aloud techniques, and to the researcher, since there is no coding of behaviors involved. Cluster analysis groups cases such that they are very similar within the cluster and dissimilar to cases outside the cluster across target variables. It is related to factor analysis, where regression models are created based on a set of variables across cases, but in cluster analysis, cases are then grouped. Proximity indices (squared Euclidean distance, or the sum of the squared differences across variables) are calculated for every pair of cases. Squaring makes them all positive and accentuates the outliers. Various clustering algorithms are available to then group similar cases. Ward's is a hierarchical clustering technique that combines cases one at a time from n clusters down to 1 cluster, choosing at each step the combination that minimizes the error sum of squares; it is used when there is no preconceived idea about the likely number of clusters. K-means clustering, a non-hierarchical technique, tests an empirical rationale for a predetermined number of clusters. It may also be used when there is a large sample size in order to increase efficiency; if no empirical basis exists, the model is run on 3, 4, and 5 clusters. The method calculates k centroids and associates cases with the closest centroid, repeating until the error is minimized by allowing cases to move to a different centroid. It may also be possible to use two different kinds of techniques, for example, a Ward's cluster analysis on a small sample followed by a k-means cluster analysis based on the findings from Ward's.
After determining the clusters, the characteristics of each cluster should be compared to ensure there is a meaningful difference among them, and that there is a meaningful difference in the outcome based on their behaviors, since cluster analysis can find structure in data where none exists. ANOVA may then be used to determine, for each cluster, how much each variable contributes to variation in the dependent variable. It may be useful to use more than one technique and compare or average the results, as different techniques may produce different outcomes.
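The assign-then-update loop of k-means described above can be sketched in a few lines of plain Python. This is only an illustration, not any of the cited authors' implementations; the function name, the deterministic seeding of centroids from the first k cases, and the toy data are all my own choices.

```python
def kmeans(points, k, iters=100):
    """Minimal k-means sketch: assign each case to its nearest centroid,
    recompute each centroid as its cluster's mean, repeat until stable."""
    # Deterministic illustration: seed the centroids with the first k cases
    # (real implementations usually sample starting centroids at random).
    centroids = list(points[:k])
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance,
        # the proximity index described above.
        labels = [min(range(k),
                      key=lambda j: sum((a - b) ** 2
                                        for a, b in zip(pt, centroids[j])))
                  for pt in points]
        # Update step: move each centroid to the mean of its cluster.
        updated = []
        for j in range(k):
            members = [pt for pt, lab in zip(points, labels) if lab == j]
            updated.append(tuple(sum(c) / len(members) for c in zip(*members))
                           if members else centroids[j])
        if updated == centroids:  # no centroid moved: converged
            break
        centroids = updated
    return labels, centroids

# Two well-separated 2-D groups; k = 2 recovers them.
data = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3),
        (5.0, 5.0), (5.2, 4.9), (4.8, 5.1)]
labels, centroids = kmeans(data, k=2)
```

Running the hierarchical step first (as in the Ward's-then-k-means strategy above) would amount to using Ward's cluster means as the starting centroids instead of the first k cases.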

Bain, L.J. & Engelhardt, M. (1991). Special probability distributions. In Introduction to probability and mathematical statistics (2nd ed.). Belmont, CA: Duxbury Press.

A Bernoulli trial has two discrete outcomes whose probabilities add up to 1. A series of independent Bernoulli trials forms a Binomial distribution, where the number of successes (or failures) is counted over n trials. A Hypergeometric distribution arises when n samples are taken without replacement from a population of N+M. It can be useful for testing a batch of manufactured products for defects in order to accept or reject the batch. The Geometric distribution gives the number of Bernoulli trials that must occur to achieve the first success. The Negative Binomial distribution gives the number of Bernoulli trials that must occur to achieve n successes. The Poisson distribution describes the probability of n independent events occurring over a fixed interval (or as the limiting case of many trials, each with a small probability of success). The discrete uniform distribution allows for n possible values, each with equal probability of occurrence.
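The probability mass functions of several of these distributions can be written out directly; a quick sanity check is that each sums to 1 over its support. The function names below are my own, and the parameterizations follow the descriptions above (N successes and M failures for the hypergeometric, trial number x for the geometric).

```python
import math

def binomial_pmf(n, p, x):
    """P(X = x) successes in n independent Bernoulli(p) trials."""
    return math.comb(n, x) * p ** x * (1 - p) ** (n - x)

def hypergeometric_pmf(N, M, n, x):
    """P(X = x) successes when drawing n items without replacement
    from a population of N successes and M failures (N + M items)."""
    return math.comb(N, x) * math.comb(M, n - x) / math.comb(N + M, n)

def geometric_pmf(p, x):
    """P(X = x): the first success occurs on trial x (x = 1, 2, ...)."""
    return (1 - p) ** (x - 1) * p

def poisson_pmf(mu, x):
    """P(X = x) independent events in an interval with mean rate mu."""
    return math.exp(-mu) * mu ** x / math.factorial(x)

# A Bernoulli trial is the n = 1 binomial case: its two outcomes sum to 1.
p_fail, p_success = binomial_pmf(1, 0.3, 0), binomial_pmf(1, 0.3, 1)
```

Summing, say, `binomial_pmf(10, 0.4, x)` for x from 0 to 10, or `hypergeometric_pmf(5, 15, 4, x)` for x from 0 to 4, returns 1 up to floating-point error.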

Blau, B.M., Brough, T.J., & Thomas, D.W. (2013). Corporate lobbying, political connections, and the bailout of banks. Unpublished manuscript, Department of Finance and Economics, Utah State University, Logan, UT.

When measuring a dependent variable with discrete values, an appropriate count regression framework must be used. Poisson, negative binomial, and OLS are possible models. Poisson regression uses a distribution whose mean is equal to its variance. If the distribution is over-dispersed (its variance significantly exceeds its mean), Poisson will not work. The authors do not discuss when negative binomial or OLS are appropriate.

Collins, L.M. & Lanza, S.T. (2010). Latent class and latent transition analysis for the social, behavioral, and health sciences. New York: Wiley.

Latent variables are unobserved but predicted by the observation of multiple observed variables. The latent variable is presumed to cause the observed indicator variables. Different models are used, depending on whether the observed and latent variables are discrete or continuous. Using a discrete latent variable helps organize complex arrays of categorical data. A given construct may be measured using either continuous or discrete variables, so the method used when there is a choice should be based on which best helps address the research questions. When cases are placed into classes, the classes are named by the researcher based on their similar characteristics.

Fisher, W.D. (1958). On grouping for maximum homogeneity. Journal of the American Statistical Association, 53, 789-798.

Grouping or clustering is a useful tool for distinguishing sets of cases based either on prior theory of what the groups should entail or with no initial structure in mind. Combining the groups has a goal of minimizing the variance or error sum of squares. For some small cases, a visual inspection of data may allow the researcher to come up with the clusters. In large data sets with evenly dispersed data, this is difficult or impossible.

Francis, B. (2010). Latent class analysis methods and software. Presented at 4th Economic and Social Research Council Research Methods Festival, 5 - 8 July 2010, St. Catherine’s College, Oxford, UK.

Latent class cluster analysis assigns cases to groups based on statistical likelihood; they do not have to be assigned to discrete classes. K-means clustering is problematic, since the number of groups has to be specified a priori, cases must be assigned to unique clusters, and only continuous data are allowed.

Gardner, W., Mulvey, E.P., & Shaw, E.C. (1995). Regression analyses of counts and rates: Poisson, overdispersed Poisson, and negative binomial models. Psychological Bulletin 118(3).

Researchers often use suboptimal strategies when analyzing count data, such as artificially breaking counts down into categories of 5 or 10, but this loses data and statistical power. Another ineffective strategy is to use regular linear regression, or OLS. Using OLS, illogical values, such as negatives, will be predicted, and the model's variance of values around the mean is not likely to fit well. Another problem with OLS is heteroscedastic error terms, where larger values have larger variances and smaller values smaller variances. Nonlinear models that allow for only positive values and describe the likely dispersion about the mean must be used. Poisson places restrictive assumptions on the size of the variance. The overdispersed Poisson model corrects for the large variances that are common. The negative binomial is another option. In the regular Poisson model, truncated extreme tail values could lead to underdispersion, and a large number of high values could lead to overdispersion. An overdispersion parameter is calculated by dividing Pearson's chi-squared by the degrees of freedom; the overdispersion parameter is then multiplied by the mean to give the corrected variance. The negative binomial model includes a random component that accounts for individual variances. The negative binomial model allows one to estimate the probability distribution, where the overdispersed Poisson does not.
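The overdispersion parameter described by Gardner et al. is simple to compute once a model's fitted means are in hand. The sketch below uses a hypothetical intercept-only Poisson fit (every case's fitted mean is just the sample mean); the function name and data are illustrative, not from the article.

```python
def overdispersion_parameter(observed, fitted, n_params):
    """Pearson chi-squared divided by the residual degrees of freedom.
    Values well above 1 signal overdispersion; the variance is then
    modeled as this parameter times the mean."""
    chi_squared = sum((y - mu) ** 2 / mu for y, mu in zip(observed, fitted))
    df = len(observed) - n_params  # cases minus estimated parameters
    return chi_squared / df

# Hypothetical counts whose variance clearly exceeds their mean of 4;
# an intercept-only Poisson model fits every case at that mean.
counts = [0, 0, 1, 2, 9, 12]
phi = overdispersion_parameter(counts, [4.0] * len(counts), n_params=1)
```

Here `phi` comes out well above 1 (6.7), so a plain Poisson model's standard errors would be too small; scaling the variance by `phi`, or switching to a negative binomial model, would be the corrections discussed above.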

Osgood, D.W. (2000). Poisson-based regression analysis of aggregate crime rates. Journal of Quantitative Criminology 16(1).

The normal approach to analyzing per capita rates of occurrence is to use the OLS model. However, OLS does not provide an effective model when recording a small number of events. For large populations, OLS may work, but for a small number of events in a small population, the result is an overestimated rate of occurrence. Often small counts will be skewed, with a floor of 0. The Poisson model corrects for many of these issues with OLS; however, the unlikely assumption of the Poisson's mean equaling its variance must hold. Due to individual variations and correlation between observed values and variance, overdispersion is common. Adjusting the standard errors, and thus the t-test results, for the overdispersion helps correct the model. The negative binomial model combines the Poisson distribution with a gamma distribution that accounts for unexplained variation.

Romesburg, H.C. (1990). Cluster Analysis for Researchers. Malabar, FL: Robert E. Krieger Publishing Company.

The steps in cluster analysis begin with creating the data matrix of objects and their attributes. The objective is to determine which objects are most similar based on those attributes. An optional step is to standardize the data matrix. A resemblance matrix is then calculated, showing for each pair of objects a similarity coefficient, such as the Euclidean distance. Based on the similarity coefficients, a tree is created by combining similar objects and comparing their average to the other existing objects. Objects in the data matrix are then rearranged to place the closest objects next to each other.
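The first steps Romesburg describes (data matrix, optional standardization, resemblance matrix) can be sketched directly; the function names and the three toy objects are my own, and tree building is left to a clustering routine.

```python
import math

def standardize(matrix):
    """Optional step: rescale each attribute (column) to mean 0, sd 1."""
    cols = list(zip(*matrix))
    means = [sum(c) / len(c) for c in cols]
    sds = [math.sqrt(sum((v - m) ** 2 for v in c) / len(c))
           for c, m in zip(cols, means)]
    return [[(v - m) / s for v, m, s in zip(row, means, sds)]
            for row in matrix]

def resemblance_matrix(matrix):
    """Similarity coefficient (Euclidean distance) for every pair of objects."""
    n = len(matrix)
    return [[math.sqrt(sum((a - b) ** 2 for a, b in zip(matrix[i], matrix[j])))
             for j in range(n)] for i in range(n)]

# Three objects with two attributes; the first two are clearly most similar.
objects = [[1.0, 2.0], [1.2, 1.9], [8.0, 9.0]]
dist = resemblance_matrix(standardize(objects))
```

The resulting matrix is symmetric with zeros on the diagonal, and the smallest off-diagonal entry identifies the first pair the tree-building step would combine.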

Velasquez, N.F., Sabherwal, R., & Durcikova, A. (2011). Adoption of an electronic knowledge repository: A feature-based approach. Presented at 44th Hawaii International Conference on System Sciences, 4-7 January 2011, Kauai, HI.

This article discusses the types of use for knowledge base users. It utilizes a cluster analysis to come up with three types of users. The clustering methods compared were Ward's, between-groups linkage, within-groups linkage, centroid clustering, and median clustering; the one with the best fit was used.

Wang, W. & Famoye, F. (1997). Modeling household fertility decisions with generalized Poisson regression. Journal of Population Economics, 10.

Poisson and negative binomial models account for non-negative counts of discrete occurrences. The Poisson model requires that the mean and variance of the dependent variable are equal, which is rarely true. This leads to a consistent model but invalid standard errors. The negative binomial model handles counts with overdispersion. When underdispersion is present, a generalized Poisson regression model may be used. Generalized Poisson handles both overdispersion and underdispersion.

Ward, J.H., Jr. (1963). Hierarchical grouping to optimize an objective function. Journal of the American Statistical Association, 58, 236-244.

Ward describes a clustering technique that allows for grouping with respect to many variables in such a way that the loss within each group is minimized. Traditional statistics would take a group of numbers, find the mean, and then calculate the error sum of squares (ESS) for all cases against that one mean. By grouping, the ESS is reduced, since cases are compared to their own group means rather than the overall mean. The appropriate number of groups can be determined in the grouping process rather than needing to be specified in advance.
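The ESS comparison at the heart of Ward's criterion is easy to demonstrate. This sketch (my own, with made-up scores) just contrasts one overall mean against two group means; Ward's full procedure would, at each step, merge the pair of clusters whose merger increases the total ESS the least.

```python
def ess(values):
    """Error sum of squares of a group about its own mean."""
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values)

# Six scores that form two obvious groups.
scores = [2.0, 3.0, 4.0, 22.0, 23.0, 24.0]
one_group = ess(scores)                         # all cases vs. the single overall mean
two_groups = ess(scores[:3]) + ess(scores[3:])  # cases vs. their own group means
```

For these scores the single-mean ESS is 604 while the two-group ESS is only 4, which is exactly the kind of loss reduction Ward's grouping pursues.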

Server log data from online learning environments can be analyzed to examine student behaviors, in terms of pages visited, length of time on a page, order of links clicked, and so on. This analysis is less cognitively taxing to the student than think aloud techniques and to the researcher since there is no coding of behaviors involved. Cluster analysis groups cases such that they are very similar within the cluster and dissimilar to other cases outside the cluster across target variables. It is related to factor analysis, where regression models are created based on a set of variables across cases, but in cluster analysis, cases are then grouped. Proximity indices (squared Euclidean distance or sum of the squared differences across variables) are calculated for every pair of cases. Squaring makes them all positive and accentuates the outliers. Various clustering algorithms are available to then group similar cases. Ward’s is a hierarchical clustering technique that combines cases one at a time from n clusters to 1 cluster and determines which minimizes the standard error, and is used when there is no preconceived idea about the likely number of clusters. Using k-means clustering, a non-hierarchical techniques, an empirical rationale for a predetermined number of clusters is tested. It may also be used when there is a large sample size in order to increase efficiency; if no empirical basis exists, the model is run on 3, 4, and 5 clusters. The method calculates k centroids and associates cases with the closest centroid, repeating until the standard error is minimized by allowing cases to move to a different centroid. It may also be possible to use two different kinds of techniques, for example, a Ward’s cluster analysis on a small sample followed by a k-means cluster analysis based on the findings from Ward’s. 
After determining the clusters, the characteristics of each cluster should be compared to ensure there is a meaningful difference among them and that there is a meaningful difference in the outcome based on their behaviors, since cluster analysis can find structures in data where none exists. ANOVA may then be used to determine for each cluster how much each variable contributes to variation in the dependent variable. It may be useful to use more than one technique and compare or average them, as different techniques may result in a variation in the results.

Bain, L.J. & Englehardt, M. (1991). Special probability distributions. In Introduction to probability and mathematical statistics (2nd Edition). Belmont, CA: Duxberry Press.

A Bernoulli trial has two discrete outcomes whose probabilities add up to 1. A series of independent Bernoulli trials forms a Binomial distribution, where the number of successes (or failures) are determined for n trials. A Hypergeometric distribution occurs when n samples are taken from a population of N+M without replacement. It can be useful for testing a batch of manufactured products for defects in order to accept or reject the batch. The Geometric Binomial distribution determines the minimum number of Bernoulli trials that must occur to achieve a success. The Negative Binomial distribution determines the minimum number of Bernoulli trials that must occur to achieve n successes. The Poisson distribution describes the probability of n independent successes occurring over a certain number of trials. The discrete uniform distribution allows for n possible values, each with equal probability of occurrence.

Blau, B.M., Brough, T.J., & Thomas, D.W. (2013). Corporate lobbying, political connections, and the bailout of banks. Unpublished manuscript, Department of Finance and Economics, Utah State University, Logan, UT.

When measuring a dependent variable with discrete values, an appropriate count regression framework must be used. Poisson, negative binomial, and OLS are possible models to use. Poisson regression uses a distribution where the mean is equal to its variance. If the distribution is over-dispersed or significantly greater than 0, Poisson will not work. No discussion of when negative binomial or OLS work.

Collins, L.M. & Lanza, S.T. (2010). Latent class and latent transition analysis for the social, behavioral, and health sciences. New York: Wiley. Latent variables are unobserved but predicted by the observation of multiple observed variables. The latent variable is presumed to cause the observed indicator variables. Different models are used, depending on whether the observed and latent variables are discrete or continuous. Using a discrete latent variable helps organize complex arrays of categorical data. A given construct may be measured using either continuous or discrete variables, so the method used when there is a choice should be based on which best helps address the research questions. When cases are placed into classes, the classes are named by the researcher based on their similar characteristics.

Fisher, W.D. (1958). On grouping for maximum homogeneity. Journal of the American Statistical Association, 53, 789-798.

Grouping or clustering is a useful tool for distinguishing sets of cases based either on prior theory of what the groups should entail or with no initial structure in mind. Combining the groups has a goal of minimizing the variance or error sum of squares. For some small cases, a visual inspection of data may allow the researcher to come up with the clusters. In large data sets with evenly dispersed data, this is difficult or impossible.

Francis, B. (2010). Latent class analysis methods and software. Presented at 4th Economic and Social Research Council Research Methods Festival, 5 - 8 July 2010, St. Catherine’s College, Oxford, UK.

Latent class cluster analysis assigns cases to groups based on statistical likelihood; they do not have to be assigned to discrete classes. K-means clustering is problematic, since the number of groups has to be specified a priori, cases are assigned to unique clusters, and only allows continuous data.

Gardner, W., Mulvey, E.P., & Shaw, E.C. (1995). Regression analyses of counts and rates: Poisson, overdispersed Poisson, and negative binomial models. Psychological Bulletin 118(3).

Researchers often use suboptimal strategies when analyzing count data, such as artificially breaking down counts into categories of 5 or 10, but this loses data and statistical power. Another ineffective strategy is to use regular linear regression or OLS. Using OLS, illogical values, such as negatives will be predicted, and the model’s variance of values around the mean is not likely to fit well. Another problem with OLS is heteroscedastic error terms, where larger values will have larger variances and smaller values small variances. Nonlinear models that allow for only positive values and describe likely dispersion about the mean must be used. Poisson places restrictive assumptions on the size of the variance. The Overdispersed Poisson model corrects for the large variances that are common. The negative binomial is another option. In the regular Poisson model, truncated extreme tail values could lead to underdispersion and a large number of high values could lead to overdispersion. An overdispersion parameter is calculated by dividing Pearson’s chi-squared by the degrees of freedom and then the overdisperson parameter is multiplied by the mean. The negative binomial model includes a random component that accounts for individual variances. The negative binomial model allows one to estimate the probability distribution, where the overdispersed Poisson does not.

Osgood, D.W. (2000). Poisson-based regression analysis of aggregate crime rates. Journal of Quantitative Criminology 16(1).

The normal approach to analyze per capita rates of occurrence is to use the OLS model. However, OLS does not provide an effective model when recording a small number of events. For large populations, OLS may work, but for a small number of events in a small population, the results is an overestimated rate of occurrence. Often small counts will be skewed with a floor of 0. The Poisson model corrects for many of these issues with OLS; however, the unlikely assumption of the Poisson’s mean equaling the variance must hold. Due to individual variations and correlation between observed values and variance, overdispersion is common. Adjusting the standard errors and thus t-test results for the overdispersion helps correct the model. The negative binomial model combines the Poisson distribution with a gamma distribution that accounts for unexplained variation.

Romesburg, H.C. (1990). Cluster Analysis for Researchers. Malabar, FL: Robert E. Krieger Publishing Company.

The steps of a cluster analysis begin with creating the data matrix of objects and their attributes. The objective is to determine which objects are most similar based on those attributes. An optional step is to standardize the data matrix. A resemblance matrix is then calculated, showing for each pair of objects a similarity coefficient, such as the Euclidean distance. Based on the similarity coefficient, a tree is created by combining similar objects and comparing their average to the other existing objects. Finally, the objects in the data matrix are rearranged so that the most similar objects appear next to each other.
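
The steps above can be sketched with a tiny agglomerative example. This is an illustrative implementation assuming made-up data, Euclidean distance, and average linkage; it is not code from the book.

```python
import math

# Step 1: data matrix -- rows are objects, columns are attributes.
data = {"A": (1.0, 2.0), "B": (1.2, 1.8), "C": (8.0, 9.0), "D": (7.5, 9.5)}

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Step 2: resemblance -- a similarity coefficient (here, distance)
# for each pair; clusters start as single objects.
clusters = {name: [pt] for name, pt in data.items()}

# Average linkage: mean distance between all members of two clusters.
def avg_linkage(c1, c2):
    return sum(euclidean(p, q) for p in c1 for q in c2) / (len(c1) * len(c2))

# Step 3: build the tree by repeatedly merging the closest pair.
merges = []
while len(clusters) > 1:
    a, b = min(
        ((x, y) for x in clusters for y in clusters if x < y),
        key=lambda pair: avg_linkage(clusters[pair[0]], clusters[pair[1]]),
    )
    merges.append((a, b))
    clusters[a + b] = clusters.pop(a) + clusters.pop(b)

print(merges)  # the merge order shows which objects are most similar
```

On this data, A and B merge first, then C and D, then the two clusters combine, which is the tree structure the annotation describes.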

Velasquez, N.F., Sabherwal, R., & Durcikova, A. (2011). Adoption of an electronic knowledge repository: A feature-based approach. Presented at 44th Hawaii International Conference on System Sciences, 4-7 January 2011, Kauai, HI.

This article discusses the types of use among knowledge base users. It utilizes a cluster analysis to identify three types of users. Clustering methods compared were Ward's, between-groups linkage, within-groups linkage, centroid clustering, and median clustering; the method with the best fit was used.

Wang, W. & Famoye, F. (1997). Modeling household fertility decisions with generalized Poisson regression. Journal of Population Economics 10.

Poisson and negative binomial models account for non-negative counts of discrete occurrences. The Poisson model requires that the mean and variance of the dependent variable be equal, which is rarely true; violating this assumption leaves the estimates consistent but the standard errors invalid. The negative binomial model handles counts with overdispersion. When underdispersion is present, a generalized Poisson regression model may be used, as it handles both overdispersion and underdispersion.

Ward, J.H., Jr. (1963). Hierarchical grouping to optimize an objective function. Journal of the American Statistical Association 58, 236-244.

Ward describes a clustering technique that groups cases with respect to many variables in such a way that the loss within each group is minimized. Traditional statistics would take a group of numbers, find the mean, and calculate the error sum of squares (ESS) of all cases around that one mean. By grouping, the ESS is minimized because cases are compared to their own group means instead. The appropriate number of groups can be determined during the grouping process rather than needing to be specified in advance.
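
Ward's objective can be illustrated numerically by comparing the error sum of squares around a single overall mean with the ESS around group means. The numbers below are made up for illustration.

```python
def ess(values):
    # Error sum of squares: squared deviations from the group mean.
    m = sum(values) / len(values)
    return sum((x - m) ** 2 for x in values)

values = [2, 3, 4, 20, 21, 22]

# One group, one overall mean: large ESS.
one_group = ess(values)

# Two well-chosen groups: the ESS drops sharply. That drop is the
# "loss" Ward's method seeks to minimize at each merge step.
two_groups = ess([2, 3, 4]) + ess([20, 21, 22])

print(one_group, two_groups)
```

Here the single-mean ESS is 490 but the grouped ESS is only 4, showing how much tighter the fit becomes when cases are compared to their group means.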

## Friday, November 21, 2014

### Instructional Design - An Annotated Bibliography

Anderson, L.W. & Krathwohl, D.R. (2001). A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New York: Longman.

Instructional objectives are the exposure to terminology and scaffolded practice that build toward educational objectives. Educational objectives are the mid-level competency, the level at which assessment should culminate. Global objectives are the connection to the workplace – what a professional in the field should be able to do, or at a minimum, the big-picture goal of a degree program. The process may also be described in the other direction, as a cognitive mapping exercise in which global objectives are deconstructed into educational objectives, and educational objectives into instructional objectives, using a template representing the six levels of the taxonomy: Remember, Understand, Apply, Analyze, Evaluate, and Create. Objectives describe what a student should be able to do, not a specific instructional or assessment activity.

Dupin-Bryant, P.A. & DuCharme-Hansen, B.A. (2005). Assessing Student Needs in Web-Based Distance Education. International Journal of Instructional Technology & Distance Learning 2(1).

Student needs assessment helps the instructor plan and facilitate the course learning experience. Learning objectives may or may not already be in place when the needs assessment is carried out, but the assessment will help refine the instructional objectives and determine where to start. Areas to assess include computer skills, learning styles, available resources, desired outcomes, and prior experience. Computer literacy may be taught to the entire class, to just the group that needs it, or integrated into other learning activities. There is a larger debate on the usefulness of learning styles inventories, but the important point is to ensure a variety of types of content and activities. Available resources are probably more important in web-based education but matter in any environment – do students have the hardware, software, and internet access to participate fully in all class activities? Course objectives are one thing, but students may be looking to get something else out of the class. Always build on what students have previously learned, whether from earlier classes or from experience in the workplace.

Fisher, D. H. (2012). Warming up to MOOCs. The Chronicle of Higher Education. Retrieved March 19, 2013 from http://chronicle.com/blogs/profhacker/warming-up-to-moocs.

Hesitation to use materials available from other instructors in one’s own classroom may be due to insecurity around what others will think about using outsourced lectures and what to do with class time instead of lecturing. The author used the MOOC content as homework assignments, flipping the classroom to then allow for higher level discussions of the material, instead of just presentation of the material. By utilizing materials of other faculty and contributing back, the community of scholarship extends from the research component of the faculty role to include teaching, which is often ignored.

Fusch, D. (2012). Course materials for mobile devices: Key considerations. Higher Ed Impact. Retrieved March 19, 2012, from Academic Impressions http://www.academicimpressions.com/news/course-materials-mobile-devices-key-considerations.

People spend as much time reading on a digital screen as they do reading paper, and the amount of content read on mobile devices will soon surpass what is read on a full-size computer. Faculty need to consider the usability and accessibility of the learning resources they assign to ensure they can be used effectively on mobile devices. Design for mobile first: it is easier to move from mobile to desktop than the other way around. Record short videos, avoid PDFs, and break readings into smaller chunks. Copyright and licensing considerations are important, as different licensing may apply in the mobile realm. Using open content is one way to ensure it can be ported to other platforms.

Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86.

One school of thought is that students learn by discovering or constructing concepts themselves, while another holds that direct explanation of concepts and related metacognition is most effective. The assumption behind the minimal guidance approach is that experiential, self-constructed learning is best. The authors argue that an understanding of human cognitive architecture is needed to determine the most effective instructional methods. Long term memory permits experts to pull from vast experience to recognize a situation and know what to do. Given the limitations of working memory, minimally guided instruction taxes working memory while pushing little into long term memory. The authors quote Shulman's distinction between content knowledge, pedagogical content knowledge, and curricular knowledge (metacognition), as well as how one works in the field (epistemology). Studying worked examples reduces cognitive load for novice learners who are appropriately prepared for them. Research on problem-based learning (PBL) shows little or no gain in problem-solving ability.

Reed, S. K. (2006). Cognitive architectures for multimedia learning. Educational Psychologist, 41(2), 87-98.

Six theories of multimedia learning are reviewed, the first three being memory studies and the last three in instructional contexts. Paivio’s Dual Coding Theory: visual imagery is an important method of coding concepts; dual coding refers to the use of both verbal coding and visual coding of semantic meaning. Baddeley’s Working Memory Model: verbal and visual coding, with a verbal focus on phonological learning; includes the need for a “central executive” that the learner uses to guide what modality to use in the moment; author later adds to the model the episodic buffer, where information from various modalities can be combined. Engelkamp’s Multimodal Theory: acting out what is being learned results in greater recall, as action implies understanding, assuming the action is relevant to the semantics. Sweller’s Cognitive Load Theory: Multimedia design may decrease extraneous cognitive load by integrating information that needs to be presented together, worked examples, and schemas; split-attention and redundancy effects are described. Mayer’s Multimedia Theory: utilizes recommendations from other models; seven principles – multimedia (learn better from pictures and words together), spatial contiguity (corresponding words and pictures should be close to each other), temporal contiguity (words and pictures should be presented simultaneously), coherence (extraneous words, pictures, and sounds should be excluded), modality (animation + narration > animation + text), redundancy (animation + narration > animation + narration + text), and individual differences (low-knowledge learners and high-spatial learners are more affected by multimedia presentation). Nathan’s ANIMATE Theory: visual representation through simulation to help model the solution to a problem.

Renninger, K. A. (2009). Interest and identity development in instruction: An inductive model. Educational Psychologist, 44(2), 105-118.

Instructional model that states both the interest and identity of a student are important to consider in developing learning activities. Interest relates to the desire of a learner to reengage with particular content after a previous experience. Identity is the learner’s self-representation as someone who engages with particular content. Interest needs to be cultivated and sustained throughout all stages of individual development. Interest requires some understanding of what it takes to engage, not just a baseless sense of euphoria around an interesting topic. It’s also important to consider interest in achievement, as opposed to interest in the actual content. Identity changes as learners mature and come to understand how much work is required to accomplish required education levels and goals.

Shute, V. & Towle, B. (2003). Adaptive E-Learning. Educational Psychologist 38(2).

Early iterations of e-learning were concerned with simply getting information online, but the focus now is on improving learning and performance. In order to be the most effective for each individual learner, the characteristics of each learner should be assessed. One behavior a system should encourage is exploration, as students who explore and participate in optional material tend to perform better on assessments. The learner model represents what the individual learner knows, and the instructional model presents and assesses content in the most appropriate way. Adaptive e-learning should provide, not the same textbook in a scrolling page instead of a physical page, but rather dynamically ordered and filtered pages to present learners just what they need right when they need it.

Smith, L. (2003). Software Design. In Guidelines for Successful Acquisition and Management of Software Intensive Systems (4th ed.). Hill Air Force Base, UT: U.S. Air Force Software Technology Support Center.

Given the complex nature of programming, design is the key phase between gathering requirements and actual development. Design includes several iterations: functional design (logic, desired outputs, rules, data organization, and user interface), system design (system specifications, software structure, security, and programming standards), and program design (software units, test plans, user documentation, install plan, training plan, and programmer manual). Design methods include structured design (ordered functions and subroutines), object-oriented design (objects inherit from parents, changes can be pushed out to many related objects, and the specifics of what happens inside each object are less important), and extreme programming (frequent code review, testing, and release iterations).

Stiggins, R. & DuFour, R. (2009). Maximizing the Power of Formative Assessments. Phi Delta Kappan 90(9). Retrieved January 31, 2013 from http://alaskacc.org/sites/alaskacc.org/files/Stiggins%20article.pdf.

Formative assessment helps to track individual student achievement and performance and drive continuous improvement. Common assessments are created by multiple faculty members teaching the same course. Common formative assessments can result in significant improvement of learning if they are specifically integrated into instructional decision making, high quality, and used to benefit student learning. Assessments may be at the classroom, school, or institutional level. No matter the level or how they are used, assessments need clear learning targets that are appropriately scaffolded and achievable, based on established standards, high quality and high fidelity, and are available in a timely and understandable form in order to help the learner do better next time. Common formative assessments can be used to determine how an individual student is doing as well as to compare classroom performance. The act of putting together a common assessment allows the conversation to happen regarding what is truly important to measure. The greater dialogue among faculty members results in a higher quality assessment than what any individual teacher might be able to create.

van Merriënboer, J. J. G., & Sweller, J. (2005). Cognitive load theory and complex learning: Recent developments and future directions. Educational Psychology Review, 17(2), 147-177.

When introduced, cognitive load theory led to new types of instructional methods, such as providing many worked examples instead of problems to solve. The theory posited that long term memory is made up of schemas, which make meaning out of complex structures. Working memory is limited when dealing with new content but not limited when working with schemas from long term memory. Cognitive load theory deals with the processing of content in working memory to create schemas stored in long term memory. In order to take a dynamic approach, where instruction is automatically tailored to the learner, the knowledge of learners must be assessed and methods of promoting effective instruction for each group of learners are needed. Assessment should include ability to generate correct responses as well as the mental effort required to accomplish that.

White, B. & Frederiksen, J. (2005). A theoretical framework and approach for fostering metacognitive development. Educational Psychologist, 40(4).

Metacognition is crucial in helping individuals learn through inquiry and work together in teams. Understanding how to use inquiry learning will help the learner be more effective in using inquiry learning strategies. Inquiry then includes inquiry about inquiry and inquiry about the domain of study. Advisors to help manage metacognition may be automated tutors or other learners.

Wiggins, G. & McTighe, J. (2006). Backward Design. In Understanding by Design (Expanded 2nd ed.). Upper Saddle River, NJ: Pearson.

When planning curriculum, it is important to pay attention to local and national standards, but also to determine the specific understandings that are desired. The focus should be on learning rather than on teaching. By determining a larger purpose first, the best resources and activities can be selected to achieve the goal. Traditional design falls into the trap of providing interesting experiences that do not lead to specific desired achievements, or of briefly covering all content without touching on any of it with enough depth to be meaningful. Start with desired goals and ask what appropriate evidence of achievement would look like, and likewise what the assessment should consist of. Only after determining the desired results and an appropriate assessment can learning experiences be planned. Some sacred cows may be harmed in the process of ensuring all activities have a specific purpose aimed at reaching the targets. The textbook may lose the central role it currently plays in many classrooms.

## Friday, October 31, 2014

### Prioritization

Let's take a look at the importance-urgency matrix from Dr. Stephen Covey. He talks about the need to prioritize your activities in order to manage your time effectively. You can keep a log of your activities throughout the day and categorize them by how urgent and important they are, and you may be surprised at where you spend most of your time. People often claim they don't have time for important items like planning and building key relationships because they are always putting out fires. It's the important and urgent items that demand our attention immediately. In between all the fires, all sorts of other small activities fill the rest of our time, but these are often unimportant items that are either forced on us by others or driven by personal preferences and obsessions.

The trick is to prioritize properly. By focusing attention on the important but not urgent items, such as strategic planning and building key relationships in accordance with your strategy, the fires will actually put themselves out. If you have a good relationship with a customer, they'll understand when one order doesn't come through right, so while you need to fix it, it's not really as much of a fire as if you had to be worried about losing the account altogether. On the other hand, you might have a customer that doesn't fit your target demographic, who causes problems, and who you don't make much money on anyway. If you can make the strategic decision to drop that customer on whose fires you're wasting a lot of time and energy, you may come out ahead, because you can focus that attention on opportunities that will provide a better return on investment. In order to have the time to focus on strategic activities, you have to eliminate the unimportant activities that don't serve a greater purpose. Eliminate or shorten some meetings; set a schedule to check email once every few hours instead of letting it distract you as it comes in; stop creating reports that you think others need but they don't actually even look at. For an IT department, focusing on strategic aspects of the system infrastructure will help ensure projects are rolled out in a way that makes sense to support the company and possibly even utilize technology to drive new business opportunities.

The SWOT Analysis is an example of a Quadrant II activity that helps you understand where you should be focusing your time in order to be the most effective. A SWOT Analysis doesn't need to be overly structured or complicated. Spending a lot of time building a pretty SWOT template and training everyone on its use would be a good example of an unimportant activity. Put it out there and let it happen, whether you use a 2x2 matrix, bulleted lists, or more of a free-form mind map. The strengths and weaknesses are inward facing. They refer to inherent qualities of the company or department and what they're currently doing. Opportunities and threats are outward facing. They are qualities of the environment, actions of competitors, or imminent events that will have an effect on you. The goal is to build on strengths and take advantage of opportunities, while eliminating weaknesses and preventing threats from knocking you down.

In order to get a handle on where to focus attention, after brainstorming, it's important to group and rank the items you have listed. Provide additional details to determine the size of the threat or the amount of money an opportunity may be worth to you. Often there are connections between the internal and external analysis. You can leverage your strengths to take advantage of opportunities and avoid threats. Overcoming a weakness may open up new opportunities. So draw those connections and quantify each aspect of your analysis, but keep your analysis simple and visual. Keeping it all on one page will allow you and others to see how all the parts tie together. Provide additional information as a separate write-up and attach it on following pages. Of course, as you begin making decisions on what to focus on, you will come up with a more detailed plan, which is great, but the initial analysis should remain simple and understandable by anyone who picks it up.
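As a quick illustration of the quantify-and-rank step above, here is a minimal sketch in Python. The items and dollar figures are made up for the example; the point is simply that once each SWOT item carries an estimated impact, sorting gives you the one-page priority view.

```python
# Hypothetical SWOT items with rough dollar-impact estimates.
# Quantifying each item lets the biggest opportunities and threats
# float to the top of the one-page analysis.
swot = [
    {"category": "opportunity", "item": "New regional market",   "impact": 250_000},
    {"category": "threat",      "item": "Competitor price cut",  "impact": 120_000},
    {"category": "weakness",    "item": "Aging order system",    "impact": 80_000},
    {"category": "strength",    "item": "Loyal customer base",   "impact": 200_000},
]

# Rank everything by estimated impact, highest first.
ranked = sorted(swot, key=lambda entry: entry["impact"], reverse=True)

for entry in ranked:
    print(f"{entry['category']:<12} {entry['item']:<25} ${entry['impact']:,}")
```

However you estimate the numbers, the discipline of attaching one to every item is what keeps the analysis honest and comparable.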

People usually like showing off their good side, so it is easy to list the strengths, however realistic they actually are. It's more difficult for managers to get honest answers from their employees about real weaknesses and threats, so it is important to create a safe place when brainstorming the more negative aspects. Here's where having done your relationship building with your team will allow there to be enough trust to do this legitimately. You might have to use a technology solution that lets team members submit weaknesses and threats anonymously if you don't have the trust to do so face to face. Being aware of and honest about your challenges can demonstrate as much strength as the strengths list itself, or more.

## Tuesday, September 30, 2014

### Introvert's Dream

A colleague of mine was invited to a little virtual coffee break get-together for others who had been hired around the same time as him. There were several conversation starters sent out beforehand. I wasn't invited, so I don't know how much they stuck to the script or talked about other things.

But one of the questions spoke to me:

You are stuck on a desert island and you can only bring one song, one movie, and one type of food. What would you bring?

Keep your song and movie if I can have pizza from Sacco's and a promise that you're not pulling my leg about the desert island thing.

If I really do get a song, too, it would be one of Clapton's hour-long jams.

## Monday, August 4, 2014

### Simplification

As IT is integrated into more and more aspects of our lives at work, home, and everywhere in between, the need to make all the varying systems around us work together seamlessly leads to increased complexity. But more complexity means more cost and more likelihood of downtime. The article linked below discusses the importance of keeping it simple and provides some basic principles to keep in mind to make your organization more flexible while keeping it simple at the same time. The points in their simplification roadmap are to start at the top, use an entrepreneurial approach, use cloud services when available, and be agile. By having buy-in at all levels and focusing on adaptability, you can focus on the unique value you add rather than wasting time running around trying to reinvent the wheel or maintain the status quo.

http://www.cio.com/article/2451671/it-strategy/simplifying-it-pays-off-with-big-savings-better-business-success.html

## Thursday, July 17, 2014

### Technology Rights

A recent court case in Europe has highlighted a right that many would not immediately list among those most important to them: the right to be forgotten. Privacy expectations in Europe are different from those in the United States, as Google found out as it took pictures all over Germany for its popular Street View service. But what about the right to have links removed that refer to old newspaper articles about something that happened a decade or two ago? It happened. There was a newspaper article about it. It's public information. Things change over time, and it's old news, but should the original articles still be searchable? Technology is an enabler. It helps you do what you want bigger and faster than you could without it. But that doesn't mean you can always control it. The man suing for removal of a past legal issue now shows up in more search results than he did before, magnifying the discussion around him. So how do you effectively leverage technology to magnify the positive and manage the negative without it getting out of control on you? That's the tough question to ask in your organization.

More on the Right to be Forgotten:

http://www.legalweek.com/legal-week/blog-post/2346341/the-right-to-be-forgotten-case-google-right-this-time-ecj-hopelessly-wrong

http://www.computerworld.com/s/article/9249793/Microsoft_offers_European_Bing_users_the_right_to_be_forgotten_
