Acquiring New Technology? Build-versus-Buy is Dead

Still debating build-versus-buy for your organization's IT purchases? If so, you probably aren't getting the biggest bang for your IT dollar: build-versus-buy is dead. For better decision-making when acquiring IT systems, forget build-versus-buy and remember the Technology Acquisition Grid. You'll not only save money, you'll also make smarter long-term decisions for your organization, increasing your agility and speeding time-to-market.

In this article, I describe Software-as-a-Service (SaaS), application hosting, virtualization and cloud computing for the benefit of CEOs, CFOs, VPs and other organization leaders outside of IT who often need to weigh in on these key new technologies. I also describe how these new approaches have changed technology acquisition for the better, from the old build-versus-buy decision to the Technology Acquisition Grid. Along the way, you'll learn some of the factors that will help you decide among the various options, saving your organization time and money.

The Old Model: Build-versus-Buy

When I earned my MBA in Information Systems in the mid-1990s, more than one professor noted that the build-versus-buy decision was a critical one because it represented two often-costly and divergent paths. In that model, the decision to “build” a new system from scratch gave the advantage of controlling the destiny of the system, including every feature and function. In contrast, the “buy” decision to purchase a system created by a supplier (vendor) brought the benefit of reduced cost and faster delivery, because the supplier built the product in advance for many companies and shared the development costs across multiple customers.

Back then, we thought of build versus buy as an either-or decision, like an on-off switch, something like this:

[Figure: the build-versus-buy on-off switch]

In the end, the build-versus-buy decision was so critical because, for the most part, once you made the decision to build or buy, there was no turning back.  The costs of backpedaling were simply too high.

The Advent of Application Hosting, Virtualization, SaaS and Cloud Computing

During the 2000s, innovations like application hosting, virtualization, Software-as-a-Service (SaaS) and cloud computing changed IT purchasing entirely, from traditional build-versus-buy to a myriad of hosting and ownership options that reduce costs and speed time-to-market. Now, instead of resembling an on-off switch, the acquisition decision started to look more like a sliding dimmer switch on a light, like this:

[Figure: the build-to-buy slider]

Suddenly, there were more combinations of options, giving organizations better control of their budgets and the timeline for delivering new information systems.

What are these technologies, and how do they affect IT purchasing? Here's a brief description of each:

Application Hosting

During the dot-com era, a plethora of application service providers (ASPs) sprang up with a new business model: they would buy used software licenses, host the software at their own facilities, and lease the licenses to their customers on a monthly basis. The customers of ASPs benefited from the lower cost of ownership and reduced strain on IT staff to maintain yet another system, while the ASPs made money by pooling licenses across customers and putting often-idle software licenses to use.

While the dot-com bust put quite a few ASPs out of business, the application hosting model, in which the software runs on hardware supported by a hosting company and customers pay monthly or yearly fees to use it, survives today.

Virtualization

One of the first technologies to change the build-versus-buy decision was virtualization. By separating the hardware from the software, virtualization decouples the hardware purchase from the choice of software. With virtualization, computer hardware is first purchased to support the organization's overall technology needs. Then a self-contained version of a machine – a “virtual” machine – is installed on the hardware, along with the application software, such as supply chain or human resources software, that the business needs at that point in time.

When the organization needs a new software application that is not compatible with the first, perhaps because it runs on another operating system, it installs another virtual machine and another application on the same hardware. By doing this, the organization not only delivers software applications more quickly, because it doesn't need to buy, install and configure hardware for every application; it also spends less on hardware, because it can add virtual machines to take advantage of unused processing power.

Even better, virtual machines can be moved from one piece of hardware to another relatively easily, so like a hermit crab outgrowing its shell, applications can be moved to new hardware in hours or days instead of weeks or months.
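
For readers who like to see the mechanics, here is a minimal sketch in Python of the capacity-sharing idea described above. The host, the applications and the core counts are all hypothetical, and real hypervisors schedule resources far more cleverly than this first-fit rule:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualMachine:
    name: str
    os: str      # each VM can run a different operating system
    cores: int

@dataclass
class Host:
    name: str
    cores: int
    vms: list = field(default_factory=list)

    def free_cores(self):
        return self.cores - sum(vm.cores for vm in self.vms)

def place(vm, hosts):
    """Put the VM on the first host with enough unused processing power."""
    for host in hosts:
        if host.free_cores() >= vm.cores:
            host.vms.append(vm)
            return host
    raise RuntimeError("No spare capacity; time to buy more hardware.")

# Two applications on different operating systems share one physical server:
hosts = [Host("server-1", cores=16)]
place(VirtualMachine("supply-chain", os="Linux", cores=4), hosts)
place(VirtualMachine("hr-system", os="Windows", cores=4), hosts)
print(hosts[0].free_cores())  # -> 8 cores still free for future applications
```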

Software-as-a-Service (SaaS)

Like virtualization, Software-as-a-Service, or SaaS, reduces the costs and time required to deliver new software applications. In the most common approach to SaaS, the customer pays the software supplier a monthly subscription fee based on the number of users on the customer's staff during a given month. As an added twist, the supplier hosts the software at its own facilities, providing hardware and technical support within the monthly fee. So, as long as a reliable Internet connection can be maintained between the customer and the SaaS supplier, the cost and effort to support and maintain the system are minimal. The customer spends few resources and worries little about the software (assuming the SaaS supplier holds up its side of the bargain), enabling the organization to focus on serving its own customers instead of on information technology.
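
To illustrate the subscription math, here is a minimal sketch with a hypothetical per-user price; actual SaaS fees, tiers and billing rules vary by supplier:

```python
def saas_monthly_bill(active_users, fee_per_user=30.00):
    """The supplier hosts everything; the customer simply pays per user, per month."""
    return active_users * fee_per_user

# The bill tracks headcount, with hardware and support bundled into the fee:
for users in (40, 55, 35):
    print(f"{users} users -> ${saas_monthly_bill(users):,.2f}")
```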

Cloud Computing

The most recent of these innovations, cloud computing brings together the best qualities of virtualization and SaaS. Like SaaS, with cloud computing both hardware and software are hosted by the supplier. However, where the SaaS model is limited to a single supplier's application, cloud computing uses virtual machines to host many different applications with one (or a few) suppliers. Using this approach, the software can be owned by the customer but hosted and maintained by the supplier. When the customer needs to accommodate more users, the supplier sells the customer more resources and more licenses “on demand”. Depending upon the terms of the contract, either the customer's IT staff or the supplier maintains the hardware. In addition, in most cases, the customer can customize the software to better serve the needs of its own customers.

Adding Application Hosting, Virtualization, SaaS and Cloud Computing to the Mix – The Technology Acquisition Grid

Remember the dimmer switch I showed a few moments ago? With the addition of application hosting, virtualization, SaaS and cloud computing to the mix, it's not only possible to choose who owns and controls the future of the software; it's also possible to decide who hosts the software and hardware, in-house or with a supplier, as well as how easily the system can be transferred from one environment to another. That is, it's now a true grid, with build-to-buy on the left-right axis and in-house-to-hosted on the up-down axis. The diagram below shows the Technology Acquisition Grid, with the four main combinations of options to consider when acquiring technology.

[Figure: the Technology Acquisition Grid]


Here’s where application hosting, SaaS, virtualization and cloud computing fit into the Technology Acquisition Grid:

[Figure: the Technology Acquisition Grid with application hosting, SaaS, virtualization and cloud computing placed on it]
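
For the more technically inclined, the grid boils down to a lookup across two axes. The sketch below is my own illustration, with the quadrant placements inferred from the descriptions above rather than read off the diagram:

```python
# The Technology Acquisition Grid as a lookup table:
# (ownership axis: build vs. buy) x (hosting axis: in-house vs. hosted).
GRID = {
    ("build", "in-house"): "Traditional custom development on your own hardware",
    ("buy", "in-house"): "Traditional packaged software, often on virtualized servers",
    ("buy", "hosted"): "Application hosting or SaaS",
    ("build", "hosted"): "Cloud computing: customer-owned software, supplier-hosted",
}

def acquisition_option(ownership, hosting):
    return GRID[(ownership, hosting)]

print(acquisition_option("build", "hosted"))
```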


Making a Decision to Host, Virtualize, Go SaaS, or Seek the Cloud

If the rules of the game have changed so much, how do we make the decision to use virtualization, application hosting, SaaS or cloud computing, as opposed to traditional build or buy? A few key factors drive the decision.

At the most basic level, it comes down to how much control, and how much responsibility, your organization wants over the development of the software and the maintenance of the system. Choose an option in the top-left of the Technology Acquisition Grid and you have greater control of everything; choose an option at the bottom-right and you have far less control and far less responsibility for the system.

In my own experience advising clients during technology acquisition and leading technology initiatives, decision-makers tend to choose a “control everything” solution because it's the easiest to understand and appears to pose the least risk. While this may, in the end, be the best answer, organizations should weigh the other options as well. More control usually sounds good, but it almost always comes with much higher costs and delays use of the system by months. Particularly for smaller organizations, which probably need those IT dollars to serve their own customers more effectively, a “control everything” answer is often the wrong decision.

Which should your organization choose? Start by including software products that take advantage of hosting, virtualization, SaaS and cloud computing among your choices when you begin your search. Then weigh the benefits and downsides of each option and combination of options, choosing the one that balances cost and time-to-market with your own customers' needs and your tolerance for risk. A good consulting company like Cedar Point Consulting can help you do this, as can your organization's IT leadership. Using this approach, you'll free yourself from the old rules of build-versus-buy, delivering more for your own customers at a much lower cost.

Donald Patti is a Principal Consultant with Cedar Point Consulting, a management consulting practice based in the Washington, DC area, where he advises businesses in technology strategy, project management and process improvement. Cedar Point Consulting can be found at http://www.cedarpointconsulting.com.


Intuitive to Whom? In Web Design, it Matters

During a recent Management Information Systems course I taught for the University of Phoenix, I posed the discussion question to students, “What do you think are the most important qualities that determine a well-designed user interface?” While responses were very good, nearly all of my students used the term “intuitive” in their responses without providing a more detailed description, as though the term had some universal, unambiguous meaning to user interface (user experience) designers and web users alike.

I responded by asking, “Intuitive to whom? Would a college-educated individual and a newborn infant both look at the same user interface and agree it is intuitive? Or would the infant prefer a nipple providing warm milk to embedded Flash videos of news stories?”

Far from obvious, an “intuitive” user interface is extremely hard to define because “intuitive” means many different things to many different people. In this article, I challenge the assumption that “intuitive” is obvious and suggest how we can determine what intuitive “is”.

Nature and Nurture

Our exploration of intuitive user interfaces and user experience starts with “nature” and “nurture”, much like the “Nature versus Nurture” debate that occurs when explaining the talents and intelligence of human beings. For those of us who haven’t opened a genetics book in a few decades, if ever, “Nature” assumes that we have certain talents at birth, while “Nurture” proposes that we gain talents and abilities over time.

Certainly, “Nature” plays a role in an intuitive user interface. According to research by Anya Hurlbert and Yazhu Ling (http://ts-si.org/neuroscience/2464-sex-differences-and-favorite-color-preference), there's a great deal of evidence that we are born with color preferences and that color preferences naturally vary by gender. In addition, our responses to warning colors, such as red on stop signs and yellow on caution signs, are likely a matter of biology rather than something learned after we're born. So, an “intuitive” interface is partly determined by our genes.

“Nurture” also plays a big role in determining our preferences in a user interface. For example, link-underlining and word density preferences on web pages are highly dependent upon your cultural background, according to Piero Fraternali and Massimo Tisi in their research paper, “Identifying Cultural Markers for Web Application Design Targeted to a Multi-Cultural Audience.” While research in personality and user interfaces is still in its infancy, there's a strong indication that CEOs have different color preferences from other individuals, as Del Jones describes in this USA Today article.

But what about navigation techniques, like tabs and drop-down menus? In a recent conversation with Haiying Manning, a user experience designer with the College Board, I was told that “tabs are dead.” This crushed me, quite frankly, because I still like tabs for effectively grouping information and have a great deal of respect for Haiying's skills and experience. As a Gen-Xer who spent much of his teen years sorting and organizing paper files on summer jobs, I'm also very comfortable with tabs in web interfaces, as are my baby-boomer friends. My Net-Gen (Millennial) friends seem to prefer a screen the size of a matchbox and a keyboard with keys the size of ladybugs, which I have trouble reading. (Nevertheless, Haiying is right.)

In the end, because of “Nature” and “Nurture”, the quest for an “intuitive” user interface is far more difficult than selection of a color scheme and navigation techniques everyone will like. What appeals to one gender, culture or generation is unlikely to appeal to others, so we need to dig further.

It’s all about the Audience

In looking back on past successful projects, the best user interface designers I've worked with learned a great deal about their audience – not just through focus groups and JAD sessions, but through psychometric profiling and market research. This idea of segmenting audiences and appealing to each audience separately is far from new: Olga De Troyer called it “audience-driven web design” back in 2002, but the concept is still quite relevant today.

Once they better understood their target customers, these UI designers tailored the user interface to create a user experience that was most appealing to their user community. In some cases, they provided segment-targeted user interfaces – one for casual browsers and one for heavy users, for example. In other cases, they made personalization of the user interface easier, so that heavy users could tailor the interface to their own preferences.

They also mapped out the common uses (use cases or user stories) for their web sites and gave highest priority to the most used (customer support) or most valuable (buying/shopping) uses, ensuring that they maximized value for their business and the customer. More importantly, these user interface designers didn't rely upon the “logo always goes at the top left” mind-set that drives most web site designs today.
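
As a simple illustration of segment-targeted design, here is a minimal sketch; the segments and settings are hypothetical, not drawn from any particular project:

```python
# Hypothetical audience segments mapped to default interface settings.
SEGMENT_DEFAULTS = {
    "casual_browser": {"layout": "simple", "word_density": "low", "show_shortcuts": False},
    "heavy_user": {"layout": "dense", "word_density": "high", "show_shortcuts": True},
}

def ui_config(segment, personal_overrides=None):
    """Start from the segment's defaults, then layer on personalization."""
    config = dict(SEGMENT_DEFAULTS[segment])
    config.update(personal_overrides or {})
    return config

# A heavy user who still prefers a simple layout:
print(ui_config("heavy_user", {"layout": "simple"}))
```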

Think about the Masai

In hopes of better defining what “intuitive” is, I spoke with Anna Martin, a Principal at August Interactive and an aficionado of web experience and web design. Evidently, “intuitive” is also a hot topic with Anna, because she lunged at the topic, responding:

“Would you reach for a doorknob placed near the floorboard, or expect the red tube on the table to contain applesauce? Didn't think so. But what's intuitive depends largely on what you're used to. Seriously, talk to a Masai nomad about a doorknob – or ketchup, for that matter – and see what you get. And good luck explaining applesauce. (Cinnamon, anyone?) Clearly intuition is dependent on what comes NATURALLY to a user – no matter what the user is using.

So why would the web be any different? It's not. Virtual though it may be, it's still an environment that a PERSON needs to feel comfortable in, in order to enjoy. Bottom line is this… if you wouldn't invite your 6-year-old niece or your 80-year-old grandmother to a rave (did I just date myself?), then don't expect that every website will appeal to every user.

Know your audience, understand what makes them comfortable; and most importantly try to define what ‘intuitive’ means specifically in regards to sorting, finding, moving, viewing, reading and generally experiencing anything in their generation.”

So, audience-driven web design has firmly embedded itself in the minds of great designers, who must constantly challenge convention to create truly creative interactive experiences on the web. Consequently, as the field of user interface design transitions into a world of user experience, it will require second-guessing many of the design conventions present on the web today. This not only means pushing the envelope with innovative design; it also means we need a good handle on what “intuitive” really is.

Donald Patti is a Principal Consultant with Cedar Point Consulting, a management consulting practice based in the Washington, DC area, where he advises businesses in project management, process improvement, and small business strategy.  Cedar Point Consulting can be found at http://www.cedarpointconsulting.com.

Departing Waterfall – Next Stop Agile

It's been more than a year since I penned “Before Making the Leap to Agile”, an article intended to guide everyone from C-level executives to IT project managers on the adoption of Agile. The goal was to offer up some of the lessons I learned through actual implementations, so that readers could avoid some of the pitfalls associated with Agile adoption. While a few saw it as an assault on Agile, many understood that my goal was to assist Agile adopters and thanked me for writing it.

Five-thousand-plus page views of that article later, I've finally cleared my plate enough to address an equally important topic: why people, and organizations, are making the shift to Agile from the more typical Waterfall. After all, Agile is a revolutionary approach to software development and it continues to grow in popularity, so I think it's important for those who do not yet use Agile to understand why others have embraced it.

Why are people abandoning Waterfall and moving to Agile?

1. Agile is Adaptive. For the project team, as well as the business, Agile enables you to make quick changes in direction so that your software product and your business can respond to a rapidly changing business environment.

How? Agile teams use two-to-four-week iterations, often called sprints, in which to develop and then release a working product. At the end of each sprint, the team uses a retrospective to look back on the work completed and see how productivity can be improved; the team also works with the customer to determine which work should be accomplished during the next sprint. One technique enables continuous improvement; the other enables the business to re-prioritize work based on changes in the business climate. Together, they make Agile highly adaptive when compared to a Waterfall approach that effectively locks the team into both a process and a business strategy for a number of months.

2. WYSIWYG (What You See Is What You Get) Development. Many of us are familiar with this wonderful cartoon that shows how projects really work — at least in a Waterfall world. Notice how there’s an enormous disconnect between the first image, “what the customer asked for”, and the last, “what the customer really needed”.

Arguably, this happens because those of us in software development listen dutifully to what our customer says, document their words in the form of requirements, and then go off and build it, assuming all along that our customer knows not only what they want, but what their end customers want. In reality, many of us have only a rough idea of what we want and often less of an idea of what our customers want, particularly with software products that serve the masses (sure, focus groups and usability testing make a big difference, but they still fall short in many instances).

Agile takes an entirely different approach to requirements gathering. Product features are identified for development and then the team works together with the business customer to build the features cooperatively. In many cases, user stories are written, screen mockups are drawn and simple business rules are written, but nothing too sophisticated. Instead, the Agile team relies upon heavy interaction with the customer or product owner to elicit requirements on-the-fly.

For example, not sure what the business customer wants on a particular screen? Show them what you've got and see if it fits their expectations. Even if it is what they asked for, see if it's going to serve their customers' needs as they intended, or if it needs some refinement. Either way, if they want a change, change it. Using this nimble approach, there is little risk of misinterpreting requirements and even less risk that the finished product misses the mark.

3. Shorter Time-to-Market. Let’s be honest here – who among us hasn’t reported to a C-level who has a great idea and wants something on the market – yesterday. (Heck, I’ve been guilty of this myself). Using a Waterfall approach, delivering anything to the marketplace takes months and sometimes years. But, by taking an Agile approach, the bare-bones features of a new product can be delivered in weeks, then, further enhanced to provide a truly robust solution. Again, the secret to shorter time-to-market lies in the use of iterations (sprints), with the end of each sprint another opportunity to deliver more features to the customer. Agile has this – Waterfall doesn’t.

4. Greater Employee Satisfaction. One of the oft-cited byproducts of Agile development is greater employee satisfaction – both by the project team and the line-of-business responsible for delivering the product. According to Steve Greene and Chris Fry, Salesforce.com reported an 89% employee satisfaction rating after adopting Agile when compared to only 40% before adoption.

In a similar vein, research by Grigori Melnick and Frank Maurer from the University of Calgary showed 82% of employees at Agile-adopting businesses were satisfied or very satisfied with their jobs, while only 41.2% were satisfied or very satisfied in non-Agile shops (2006, Comparative Analysis of Job Satisfaction in Agile and Non-Agile Software Development Teams).

5. Higher Quality. By most accounts, adopting Agile reduces defects and results in higher product quality. While I have personally seen Agile projects head in the wrong direction and suffer from higher defect rates initially, many sources have noted significantly higher quality on Agile projects. In VersionOne's 2008 survey on Agile adoption, 68% of respondents reported improved product quality as one of the benefits (3rd Annual Survey on the State of Agile Development). Similarly, David Rico et al. report an average 75% improvement in quality from adopting Agile (The Business Value of Agile Software Methods, 2009, J. Ross Publishing).

6. Higher ROI. If there's one single reason for the corner office to be sold on Agile, it has to be higher ROI. Because Agile reduces project overhead, delivers beneficial work more quickly and produces higher-quality products, it also delivers a higher ROI to the businesses that adopt it. According to research that compiled data from multiple sources, including Microsoft, VersionOne and the University of Maryland, Agile projects average a 1,788% ROI, compared with 173% for Waterfall projects (The Business Value of Agile Software Methods, 2009, J. Ross Publishing). While these numbers may be skewed toward the low side for Waterfall because the comparison included only CMMI-adopting organizations, that hardly makes up for a ten-fold difference between the two.
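
If the percentages are hard to picture, here is a minimal sketch of the arithmetic, using hypothetical dollar figures of my own rather than anything from the study:

```python
# ROI as typically defined: net benefit relative to cost, as a percentage.
def roi_percent(benefit, cost):
    return (benefit - cost) / cost * 100.0

# On the same hypothetical $100,000 project cost, the cited averages imply
# very different net returns:
cost = 100_000
print(f"Waterfall at 173% ROI -> ${1.73 * cost:,.0f} net benefit")
print(f"Agile at 1,788% ROI   -> ${17.88 * cost:,.0f} net benefit")
print(roi_percent(benefit=1_888_000, cost=cost))  # -> 1788.0
```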

With all of this evidence residing squarely in the corner “for” Agile adoption, it’s sometimes hard to understand why Waterfall is still practiced. But the truth is, adopting Agile takes a paradigm shift in thinking that is not easy for individuals, much less organizations, to make. It also takes experience not only in practicing Agile, but also in managing organizational change, two qualities critical in Agile consultants.

This is why the Cedar Point Consulting team tailors its Agile implementations to each organization, choosing the tools and techniques that best match your industry and needs so that you avoid many of the pitfalls and have a successful adoption. It's also why I have personally put so much time and effort into making Agile even more robust, not only by exploring Agile at Scale, but also by offsetting some of Agile's weaknesses with common-sense approaches that nearly every business can implement.

So, go ahead, make the leap to Agile. Just be certain you’re taking the right approach to Agile adoption for your organization before you begin.

Donald Patti is a Principal Consultant with Cedar Point Consulting, a management consulting practice based in the Washington, DC area, where he advises businesses in project management, process improvement, and small business strategy. Cedar Point Consulting can be found at http://www.cedarpointconsulting.com.

[Figure: Waterfall model diagram]