Gartner sees 'context-based computing' on the horizon

Being a CIO -- or working for one -- no longer means just keeping a data center and network running. Now the CIO and the IT division must become "entrepreneurial" to drum up new customers for the business and learn how to deploy cutting-edge technologies such as "context-based computing" -- or risk becoming irrelevant.

That was the message from a chorus of analysts at Gartner's 20th ITxpo Symposium in Orlando this week, where thousands of CIOs and senior-level IT professionals heard that they must think more like sales and marketing people, using the tricks of their IT trade to find new customers. That means not only supporting the usual e-commerce efforts on the Web, but stretching further to grasp and deploy "context-based computing" -- mining sources of information such as LinkedIn, Facebook and Twitter, plus GPS data and wireless mobile devices, to win new customers by pinpointing their identities in both the virtual and physical worlds.

In Gartner's view, context-based computing goes beyond the business intelligence applications that IT supports today but will likely need to make use of them to process data culled from social networking and mobile-device use. According to Gartner, this is where the future is headed, and ignoring it while the competition plows into it could mean becoming an IT dinosaur.

Context-based computing is "going to change the IT industry and change the financial model," predicted Peter Sondergaard, Gartner vice president of research, adding there's going to be a sense of "discontinuity" because of it.

In addition, Gartner is encouraging "moving IT functions to the cloud," said Gartner vice president Nick Jones. That might mean a diminished IT department, and in Gartner's view, the new mission of IT should not be simply optimizing computer and network efficiency.

So what is this all about?

In a session titled "Context-aware Computing Scenario: What CIOs need to Know," Gartner analyst William Clark sought to explain the process of combining various sources of information -- GPS data, satellite maps, and what can be gleaned from mobile-device use, Facebook, Google and telecom providers -- to build a very targeted picture of the user.

"Wireless carriers and financial-services vendors already possess the essential information elements," he said, noting context-based computing exists in some forms today, such as with the anti-fraud monitoring methods the credit-card industry uses to detect suspect card use.

Vendors such as IBM, Digby, Netbiscuits and OpenText are showing how to blend personal information with customer-related information to create a picture of the individual. Clark acknowledged that it will "give you a Big Brother moment or two."

Patterns related to individuals can be compiled using "context-aware applications using ensemble programming techniques," Clark said. Context-aware ads and displays will become commonplace as "hyperpersonalization" takes off in earnest in the next few years, he said, predicting a global net economic impact in excess of $140 billion by 2015.

"Context is where e-commerce and the Web were in 1990," he added, saying mobile smartphones and other devices will be a way to deliver $13.9 billion in content-aware advertising by 2015.

"Identity, community, intent, environment -- it's about leveraging information about people, objects and places to make the user's experience more enjoyable," Clark said. "At the center is the end user."

But Clark admitted that people may not be ready for it. In a recent survey that asked, "Can we use personally identifiable information to deliver services to you?" about 25% of U.S. respondents said they'd opt out. He also acknowledged that furor over privacy issues and the lack of effective opt-out plans may prompt lawmakers in both the United States and the European Union to step in and update legislation controlling what types of information can be used and who may collect them.

But mining social networks is part of playing the context-awareness card, according to Gartner analysts Carol Rozwell and Bill Gassman. LinkedIn, Twitter and Facebook are just some of the ways to learn about people -- what they might be saying about your company, good or bad, and how much online influence they may have, the analysts said in their presentation.

"Facebook is an advertiser's dream," Rozwell said. "Facebook has come to the point they're selling this information." For instance, Facebook is selling information about "change in relationship status" to wedding photographers. Telecom providers can also "draw a social-network picture" based on call records, she said.

A new wave of vendors, including GalaxyAdvisors, Radian6 and Attensity, is stepping in with social-media collection and analysis software and services, while idiro, NetMap and Nimzostat do industry-specific analysis. There's also a reason to do deep analytics on social communications inside organizations: to determine why specific projects and their teams aren't working optimally to carry out goals, she added.

For most Gartner ITxpo attendees, context-based computing lies in the future, if it arrives at all. Their more daring ventures today involve decisions to shift some of their IT resources from on-premises installations to cloud-based services, whether public, infrastructure-based or private.

At ITxpo, Randi Levin, CTO for the City of Los Angeles, spoke about the city's ongoing migration of 30,000 employees from an older Novell GroupWise e-mail system to Google Apps, with systems integrator CSC handling the contract, which has been posted online and has drawn a lot of attention.

Levin said the dire economic situation of Los Angeles -- a $400 million deficit this year and a $1.2 billion deficit expected in all, counting 2009 to 2011, which has meant the loss of 30% of the IT staff -- was a key factor in shifting to Google for e-mail, a move expected to save about $5.5 million over five years without a large capital outlay. "There's near-infinite scalability and a better quality of service," she said.

But the ongoing transition to cut over many thousands of city employees has not been without its challenges. The LA Police Department, for example, has had specific security concerns related to electronic subpoenas, and Google has agreed to make specific changes -- which Levin wasn't at liberty to discuss in detail -- to accommodate the police department's need for tighter security. Other changes are expected in contract amendments.

Levin said one lesson learned from the migration to the Google cloud is that it's better to move off an older e-mail system quickly rather than try to run both in parallel. In addition, she said training users on the new cloud-based system has been challenging, especially for older employees with less Web experience than their younger counterparts. "We underestimated how much training we needed," she said.

The Tribune Company, which owns several newspapers, TV stations and media Web sites, last year made the shift from internal data resources to Microsoft's Azure platform for cloud storage as part of consolidating 32 data centers down to three nationally.

Tribune CTO Steve Gable, who spoke on the same Gartner panel as Levin, said the huge data volumes -- 100 gigabytes of content uploaded each day -- are daunting, especially since the massive amount of video the Tribune staff produces isn't even part of that figure yet.

"We're moving content back and forth, back and forth," said Gable, who acknowledged, "Moving content to the cloud was difficult."

Integrating the new technologies, including the application programming interfaces needed to do this, meant a "struggle for developers," he said. "There's a challenge in scaling, and you don't build that three-tier architecture" that was common in the past. The main reason for the move to Azure was not cost savings but increased agility and flexibility, with the intent to build out cloud-based applications in the future.
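Gable didn't detail Tribune's integration code, but for a sense of the API surface involved, here is a minimal sketch of uploading one piece of content to Azure Blob Storage using the current azure-storage-blob Python SDK. The connection string, container name and blob path are placeholders, not Tribune's actual configuration.

```python
# pip install azure-storage-blob
import os
from azure.storage.blob import BlobServiceClient

# Credentials come from the environment; the variable name is a placeholder.
service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)

# Hypothetical container and blob names for one day's content.
blob = service.get_blob_client(container="articles",
                               blob="2010-10-20/front-page.xml")

with open("front-page.xml", "rb") as data:
    blob.upload_blob(data, overwrite=True)  # re-upload replaces the old version
```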

A third speaker on the panel, Eric Sills, director of advanced computing at North Carolina State University (NCSU), said the university created its own private-cloud arrangement a few years back, based on IBM BladeCenter hardware, the xCAT cluster-management tool and Linux HPC clusters, to serve the student population through a "Virtual Computing Lab" that lets students download from a menu of applications after authenticating. The NCSU system can also handle high-performance computing tasks from professors and grad students when demand from the general student population tapers off, such as over the summer.

Sills said he figures costs average about $2.20 per reservation, with the department charged $1.04 per hour. He said he's watching prices at services such as Amazon's Elastic Compute Cloud (EC2) with an eye toward one day using a public cloud service to run servers, if the cost and the computing jobs are right.
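Those figures invite a quick back-of-the-envelope comparison. In the sketch below, the VCL numbers are the ones Sills cited, while the public-cloud hourly rate is a made-up placeholder, not a quoted Amazon price.

```python
# Back-of-the-envelope math on the figures Sills cited.
cost_per_reservation = 2.20   # average cost of one VCL reservation
charge_per_hour = 1.04        # hourly rate billed to the department

breakeven_hours = cost_per_reservation / charge_per_hour
print(f"A reservation covers its cost after ~{breakeven_hours:.1f} hours")

assumed_cloud_rate = 0.50     # hypothetical public-cloud price per server-hour
print(f"At ${assumed_cloud_rate:.2f}/hour, the same {breakeven_hours:.1f} hours "
      f"would cost ${breakeven_hours * assumed_cloud_rate:.2f} in the public cloud")
```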
