Monday 26 July 2010

Legacy Transformation

Legacy ... or is it?

The Macmillan dictionary has two suitable/interesting definitions for legacy in this context:
  • something such as a tradition or problem that exists as a result of something that happened in the past; and/or
  • something that someone has achieved that continues to exist after they stop working or die.
Generally, when people in the IT business refer to legacy they are referring to an older-generation application or system that should be replaced because it can't meet current business requirements or is technically outdated (costly/difficult to maintain).  I have also heard the term legacy applied to infrastructure (networks, platforms etc.) but this is less common, as most technologists tend to focus on applications.

What I hope to capture here are some thoughts and ideas regarding legacy (or enterprise application) transformation.  The internet is full of methodologies and approaches for achieving this, and I won't bore myself by regurgitating them, but will rather capture some key ideas and challenges.

Utopia

So here it is ... the Utopian view of "legacy" transformation.
  1. Build an integration layer (whichever suits your fancy ... or a combination of ... service-oriented architecture, message-oriented middleware, enterprise service bus, message brokering, ...).
  2. Define the business (transactional and structural) and technical services.
  3. Insulate the "legacy" and deliver the service.
  4. Build a few additional services, perhaps extending the legacy function.
  5. Build a business process management layer (with associated workflow).
  6. Orchestrate the services.
  7. Link into a shiny new user interface, partner or other channel.
And if you achieve this, you can renovate "behind the service", allowing you to replace legacy systems as and when required.  The service is therefore a mediator or facade.
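To make that facade idea a little more concrete, here is a minimal sketch in Java. Everything in it is an illustrative assumption: the LegacyCustomerAdapter interface, the fixed-width host record and the canonical Customer shape are invented for the example, not taken from any particular product or API.

```java
// Minimal facade sketch: consumers depend on the service contract, never on the
// legacy system behind it.  All names and formats here are illustrative assumptions.
public class CustomerFacadeSketch {

    // Canonical shape exposed by the service.
    public record Customer(String id, String name) {
        static Customer fromHostRecord(String record) {
            // Assume a fixed-width host record: first 8 characters = id, remainder = name.
            return new Customer(record.substring(0, 8).trim(), record.substring(8).trim());
        }
    }

    // Assumed adapter onto the legacy application (host transaction, screen-scrape, file drop, ...).
    interface LegacyCustomerAdapter {
        String fetchCustomerRecord(String hostKey);
    }

    // The service acts as mediator/facade: translate the request in, translate the record out.
    public static class CustomerService {
        private final LegacyCustomerAdapter legacy;

        public CustomerService(LegacyCustomerAdapter legacy) {
            this.legacy = legacy;
        }

        public Customer findCustomer(String customerId) {
            String hostRecord = legacy.fetchCustomerRecord(customerId);
            return Customer.fromHostRecord(hostRecord);
        }
    }

    public static void main(String[] args) {
        // Stub the legacy side for the sketch; a real adapter would call the host system.
        LegacyCustomerAdapter stub = key -> String.format("%-8s%s", key, "A N Example");
        System.out.println(new CustomerService(stub).findCustomer("C1023"));
    }
}
```

Swap the stub for an adapter onto the real application and consumers never notice; renovate behind the service and they still never notice.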

"Everything changes"

The biggest problem with this answer (assuming there is a question) isn't the solution but rather how people tend to go about realising it.  There is a fundamental mindset that the past is wrong, that the people who built those solutions were naive or bad at what they did, and that we now know the answer and will build the future.  Brilliant!  Or not, as I am fairly sure (having spoken to several senior engineers) that they felt the same way in their day.

If there is one thing that I have learnt that is more important than all other lessons, it is that everything changes.  An application implemented today is soon relegated to legacy.  A methodology that is guaranteed to improve quality/productivity/value will be modified or replaced in the future.

Therefore, smart organisations are those that focus on building a core capability that enables them to adapt to change.  These organisations then manage the assets that matter, irrespective of the underlying technology.

Service Imperfect

Defining perfect business services is as much an art as it is a science.  Service granularity is a topic that promotes a lot of intellectual debate ... so I won't get into it.  However, irrespective of which services you specify, you soon run into the very real problem of realising/building those services.

Therefore a magic (the art) middle ground exists between what the business wants and needs and what you can deliver.  Can the enterprise applications deliver the function that matches that service, or does the service need to be "coarse" and meet the availability of function?

Do you spend a lot of money renovating the enterprise applications to meet your requirements or do you deliver what is available?

Can you break into the middle of a legacy workflow or do you have to trigger the start of it and accept the outcome?  Can you monitor/report on the status of the legacy workflow?
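As an illustration of that compromise, here is a rough sketch of a "coarse" service that can only start a legacy workflow and poll its status, because the application offers no way to break into the middle of the flow. The interfaces and names are assumptions made purely for illustration.

```java
// Sketch of a "coarse" service: the legacy workflow can only be started and observed,
// not entered mid-flow.  All interfaces and names are illustrative assumptions.
public class CoarseWorkflowSketch {

    enum WorkflowStatus { RUNNING, COMPLETED, FAILED }

    // What the legacy application actually exposes: start the process, query its status.
    interface LegacyWorkflowGateway {
        String startClaimProcess(String policyNumber);        // returns a legacy reference
        WorkflowStatus queryStatus(String legacyReference);   // status lookup only
    }

    // The business may have asked for a finer-grained service, but this is the
    // function that is available, so the service is "coarse" by necessity.
    static class ClaimService {
        private final LegacyWorkflowGateway gateway;

        ClaimService(LegacyWorkflowGateway gateway) {
            this.gateway = gateway;
        }

        String submitClaim(String policyNumber) {
            return gateway.startClaimProcess(policyNumber);   // trigger the start...
        }

        WorkflowStatus trackClaim(String legacyReference) {
            return gateway.queryStatus(legacyReference);      // ...and report on the outcome
        }
    }

    public static void main(String[] args) {
        LegacyWorkflowGateway stub = new LegacyWorkflowGateway() {
            public String startClaimProcess(String policyNumber) { return "LGCY-0001"; }
            public WorkflowStatus queryStatus(String reference) { return WorkflowStatus.RUNNING; }
        };
        ClaimService service = new ClaimService(stub);
        String ref = service.submitClaim("POL-778");
        System.out.println(ref + " -> " + service.trackClaim(ref));
    }
}
```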

The mythical enterprise business process layer

Looks phenomenal on a layered application architecture diagram.  However, this is really difficult (read: expensive) to achieve.  One of the big challenges relates somewhat to my ramblings on services: many legacy and/or enterprise applications have workflow and business process included within the application.  Unpicking this is very costly and seldom, if ever, delivers the business value needed to fund the project.

So assuming that you have to deal with multiple process/workflow/bpm engines you need to figure out if it makes sense to promote one of these to enterprise level.   If so, how will you manage transactions and will you need to "integrate on the desktop" using portal technology?  How will you monitor status across these engines especially with long-running transactions?
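One pragmatic pattern, sketched below under the assumption that each engine at least exposes some form of status query, is a thin aggregation layer keyed on a business correlation id. The engine names and the ProcessEngineClient interface are invented for illustration, not drawn from any real product.

```java
// Sketch: aggregating the status of one long-running business transaction across
// several process/workflow engines.  The engine names and the ProcessEngineClient
// interface are assumptions made for illustration.
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class CrossEngineStatusSketch {

    interface ProcessEngineClient {
        String engineName();
        String statusFor(String correlationId);   // e.g. "RUNNING", "WAITING", "DONE"
    }

    static Map<String, String> aggregateStatus(List<ProcessEngineClient> engines, String correlationId) {
        Map<String, String> view = new LinkedHashMap<>();
        for (ProcessEngineClient engine : engines) {
            // In practice each call hits a different engine's API, audit log or event store.
            view.put(engine.engineName(), engine.statusFor(correlationId));
        }
        return view;
    }

    public static void main(String[] args) {
        List<ProcessEngineClient> engines = List.of(
                new ProcessEngineClient() {
                    public String engineName() { return "legacy-claims-workflow"; }
                    public String statusFor(String id) { return "WAITING"; }
                },
                new ProcessEngineClient() {
                    public String engineName() { return "enterprise-bpm"; }
                    public String statusFor(String id) { return "RUNNING"; }
                });
        System.out.println(aggregateStatus(engines, "TXN-42"));
    }
}
```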

A possible solution

Ultimately if you protect what is important and externalise as much of this as possible you will find yourself in a good place.  This includes data/information, business rules and calculations.

Solution components and patterns that help address this include canonical message models, metadata management, master data management, and rules and calculation components/engines.

However, achieving this level of enterprise-wide harmonisation (even for a single component) is expensive and few achieve it.  Rather, applying domain architecture principles and tackling those areas that you can control is infinitely more achievable.

Applied, this means determining which area you control (start, stop, change direction) when it comes to designing and implementing a solution, and then focussing on that.  This could be a vertical (a department, for example) or a horizontal within the enterprise.

Now the big downside is that enterprises have to contend with multiple standards, formats etc.  However, because this is known, it can be managed by design.  Bridging/translating data formats (for example) becomes a core capability, and by default the organisation learns to manage change.  Interfacing between partners and applications falls into a similar bucket ... you get the idea.
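A toy example of what that bridging capability looks like in practice, with the formats and field layouts invented purely for illustration: two very different inbound representations are translated into one canonical message model, so everything downstream only ever sees the canonical shape.

```java
// Sketch: bridging partner- and legacy-specific formats into one canonical message model.
// The formats and field layouts below are invented purely for illustration.
public class CanonicalBridgeSketch {

    // The canonical model agreed within the domain you control.
    record CanonicalOrder(String orderId, String customerId, long amountCents) { }

    // One partner sends pipe-delimited records: ORDERID|CUSTOMER|AMOUNT(decimal)
    static CanonicalOrder fromPartnerA(String line) {
        String[] f = line.split("\\|");
        return new CanonicalOrder(f[0], f[1], Math.round(Double.parseDouble(f[2]) * 100));
    }

    // A legacy application produces fixed-width records: 10-char order id, 8-char customer, cents.
    static CanonicalOrder fromLegacyHost(String record) {
        return new CanonicalOrder(
                record.substring(0, 10).trim(),
                record.substring(10, 18).trim(),
                Long.parseLong(record.substring(18).trim()));
    }

    public static void main(String[] args) {
        System.out.println(fromPartnerA("ORD-9|CUST-7|129.50"));
        System.out.println(fromLegacyHost("ORD-9     CUST-7  12950"));
        // Both sources arrive at the same canonical shape, so downstream consumers see one model.
    }
}
```

The translation itself is mundane; the value lies in treating it as a designed, managed capability rather than an accident repeated in every interface.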

The big "gotcha" is investment - it does mean that the investment case needs to be realised on a smaller business scope; but the reality is probably closer to this than those business cases that have an enterprise wide adoption assumption.

So what does the architecture look like?

Well, this is where it gets really fun.  If you are focussing on the management of change and your response to it, then you can begin to imagine what this looks like.  Think domains.  Think services delivered and contracts.  Think measurement and reporting.  Think black-box and component paradigms.

Ideally, select a domain where you can slice deep into the organisation whilst keeping it as narrow as possible.  This will allow you to touch on as many solution elements and layers as possible, and therefore tackle risk sooner rather than later.  However, taking a horizontal domain or component domain works as well.

But more on this later.

Tuesday 13 April 2010

Brainstorm: Getting on with Green

Brainstorm printed a great article in their physical magazine on Green and accompanied it with this online piece.    I had the chance to talk to Samantha regarding IBM's Eco-Efficiency Jam and our view on rethinking "Green".

Despite the multitudes doing not much, a number of companies have risen to the challenge and gone green

While you’d be forgiven for thinking that green IT is never going to happen (and let’s be realistic – will IT ever be really green?), some companies have taken the gap and started on green projects, green buildings, green data centres and green organisations.

GijimaAST’s new head office in Midrand is a green building, although as there are several buildings involved, the word ‘campus’ is probably more appropriate.

Jan Smit, DCS business unit executive driving green IT at the company, says that Gijima has engaged with green IT and is busy establishing sustainability around not just green IT, but how organisations can reduce their carbon footprint, be more environmentally friendly and adapt to the pressures around global warming.

“Nobody’s really doing it,” he agrees. “We’re all talking about it and how good we are but no one has gone out and said, ‘Here, we can prove it’. So we’ve started a business unit for it and said we’ll go green, become environmentally friendly and use that experience.”

Net App has revamped its data centres to good effect. Giancarlo Scaramelli, NetApp EMEA IT field services manager, who is part of the ‘Net App on Net App’ program that deals with the company’s internal IT, says that by working aggressively to make its data centres as efficient as possible, the company has saved 40 000 kWh per month.

“We started retro-fitting our older data centres and building new ones with green in mind. The only way to do it was to take a holistic approach, and not just look at what we do (storage) but the whole centre.

“We started in the California data centre. The temperature was often far too low – we needed to work out the optimal operational temperature and so we’ve gone from 11 degrees to closer to 23 degrees. This saves a lot of energy because the need for cooling has reduced. We also looked at whether or not we need to cool the air all the time. In California we can use free air cooling 70 percent of the year.

“Many data centres were designed years and years ago. We still see centres with raised floors, which require pressure to get cold air to the ceiling, so we decided to do it the other way and pump cool air in from the top and let it fall naturally to the floor.”

Installing plastic sheeting, like that used in supermarket coolers, to separate hot aisles and cold aisles proved an inexpensive way to keep air flows apart.

Global view

While many see green as a CSI initiative, or at best a cost-saver, IBM CTO Clifford Foster has a slightly different approach: “People seem to have a fairly narrow view and quickly revert to talking about is a processor green or efficient, where the way I like to look at it is if you work on the assumption that green is doing more with less, i.e. how can we have a positive impact on the environment and the bottom line by doing more with what we have, then green certainly has ROI. You have no need to buy more, which has a direct and equal impact on the environment, or at least less impact.”

In January, IBM convened a Global Eco-Efficiency Jam – a massively-scaled online discussion centred on this strategic, business-critical issue. The Jam enabled senior representatives to co-operatively determine the best actions that can be taken to meet goals for a sustainable future. For 51 hours, thousands of public and private sector sustainability leaders pooled their knowledge and experiences through a series of focused discussions on Green Infrastructure, Sustainable Operations, Intelligent Systems, and Raising the Bar.

It’s something IBM, which started its green initiatives in the ‘80s, takes seriously.

“There’s a wide spectrum of options here,” Foster says. “If you look at the entire dimension of an organisation, even from a data level, if you are managing information and data in your enterprise well (through deduplication, archiving historical information and so on), it requires less storage to keep it. As you move up the stack, the same starts to apply, so you can buy less, and less is manufactured. If you look at applications working on the data, if those are more efficient, you can reduce the processing required, need less processing cycles and have an equal impact, and that goes on and out.

“Look at servers, data centres, the actual building the data centres reside in. Go up the food chain and look at how the organisation can optimise internally, and optimise with trading partners. It becomes a most compelling case for green IT as well as directly impacting effectiveness of the organisation,” he states. And he’s right too.

Tuesday 6 April 2010

Engineering News: As urbanisation gathers pace, African city managers are urged to adopt technology

Engineering News published this great article on the presentation that I delivered at the IDC Africa CIO Summit - linked and copied here for reference.

By 2050, city dwellers are expected to make up 70% of the earth’s total population and Africa has the opportunity to build intelligence into the infrastructure of its cities, says information technology services and solutions provider IBM chief technology officer for sub-Saharan Africa Clifford Foster.

“Africa has the opportunity to learn from the rest of the world’s mistakes and build ‘smartness’ and intelligence into its infrastructure. Cities are the microcosm of all the significant challenges and opportunities facing the planet,” he says.

There are a number of ways to infuse intelligence into cities and make them smarter. Smart telecommunication can enable cities to interconnect systems and lay the groundwork for longer-term economic growth. This includes smarter traffic systems, healthcare systems and food systems.

Foster says that smart transportation could improve transit experiences, reduce congestion and limit the effect of transport on the environment. Road user charging, electronic fare management and transportation information are examples of smart transportation in cities.

He points out that smart healthcare can provide opportunities to improve healthcare quality, accountability and sustainability. Health information exchanges can allow more time for treating current illnesses than the current system does, which is based on medical histories, and consumer portals can allow customers to proactively manage their own health.

Smart energy and utilities can provide opportunities to manage energy supply and demand smartly. Smart grids can detect and pre-empt problems and possibly reroute energy flow, and smart water management could detect water leakage and loss as well as contamination. Foster says that all buildings should also be built to consume energy better.

Further, cities can nurture their most valuable resources, their citizens, by using smart education, such as smart classrooms, which make remote education more interactive, and smart administration.

Citizens and communities can be protected by smart public safety systems that turn data into insight. Crime data aggregation, emergency management integration and smart surveillance are a few examples of how cities can become safer environments. He says that smart government services can also be implemented to infuse intelligence into needed services, stimulate economics and save taxpayers time and money. This will promote government accountability and integrate service delivery.

“From an African perspective, it is cheaper to implement these technologies when building infrastructure than installing them later on. One should consider how to embed intelligence into a system when building is first taking place,” Foster concludes.

Wednesday 17 March 2010

Smarter Cities at the IDC Africa CIO Summit

I was fortunate enough to present at the Africa CIO Summit again this year and have included my presentation (sans video) in this post.

Last year I discussed IBM's Smart Planet Strategy and how technology could enable this vision.  In addition I discussed the role that technologists and strategists had in changing the way that we work and live.

This year I discussed Smart Cities as it is in cities that we find a "system of systems" where key systems interact in a unique way.  I focused on examples from around the globe and how this could be applied to Africa. 

Wednesday 10 March 2010

Tackling the corporate “skills” challenge

There are two different, but related, issues that require addressing when tackling the so-called “skills shortage” and “skills challenge”.

The first area is addressing the skill and experience level of people currently within an organisation; and the second area is meeting the capacity or demand for people with a specific skill set and experience level.


Skill and experience


This is the easy part and subsequently most organisations focus the majority of their effort on solving this problem. Organisations improve the skill and experience level of their resources through training interventions and providing the right project/work experience. It is worth noting that it is far more difficult to do the latter (project/work experience) as it requires organisations to take a slightly longer-term view of their workforce requirements.

Resources can be up-skilled and cross-skilled and this is certainly a viable strategy as it allows resources to move into new areas, which creates a gap at the lower skill area that is easier to fill. This is akin to succession planning although it is managed at a group level and not an individual level.

Capacity

Understanding the capacity problem starts with understanding what the gap is and will be. This gap is the result of what the organisation anticipates they will need per major skill/profession area (to meet current and future demand) versus what the organisation currently has at its disposal. Thereafter it is necessary to apply a workforce strategy to determine what percentage of this gap should be permanently hired versus contractors and partners.

The strategy for closing this gap should include defensive (retention) and recruitment/attraction approaches. From my perspective, the most effective retention strategies are those that focus on career and profession development. This is supplemented by meeting hygiene requirements (salary, benefits) and very importantly providing a “sense of belonging”/community involvement.

The other element of addressing capacity requirements is recruitment (in conjunction with securing contractor and partner agreements for that aspect of the workforce). However, if the hypothesis is true, and there is a shortage of adequately skilled resources in the market, then we need to tackle this differently, as it is not cost-effective to over-pay for candidates - at least not as a general rule.

Transformation

This is where it gets really interesting as the only option is to actively transform the current and potential workforce. Let's look at both of those elements.

Transforming the current workforce means growing your existing resources to create capacity at the bottom of the career ladder and recruiting/training junior people to fill the gap. I believe that this is the long-term intention of our government as it is through this type of transformation that we will provide the country with the professionals it requires to be globally competitive.

Transforming the potential workforce is far more difficult. Corporates and other professional organisations work closely with tertiary education organisations to ensure that the workforce of tomorrow is adequately trained and employable. However, the tertiary education organisations themselves have to work closely with secondary and primary education organisations to ensure that they have an adequate pool of candidates to train for employment by the public and private sector.

This means that the supply chain extends to primary schools, and as such it is important to support and actively participate in education programmes at every level within the country or region in which we operate.

Friday 5 March 2010

Event Processing for a Smarter Planet

A Smarter Planet

You've heard it before but it warrants repeating: Every natural and man-made system is becoming instrumented, interconnected and intelligent. We now have the ability to measure, sense and monitor the condition of almost everything. Furthermore, people, systems and objects can communicate and interact with each other in entirely new ways. This means that we can respond to changes quickly and get better results by predicting and optimising for future events.

That's the good news but there are challenges.

Today, more than ever, organisations are under pressure to leverage a wealth of information to make more intelligent choices.

Volume of data: Data volumes are expected to grow tenfold in the next three years. This is driven by the proliferation of end-user devices, real-world sensors, lower-cost technologies and population growth.

Velocity of decision-making: The market demands that businesses optimise decisions, take action based on good information and utilise advanced predictive capabilities - all with speed and efficiency.

Variety of information: With the expansion of information comes large variances in the complexion of available data - very noisy with lots of errors and no opportunity to cleanse it in a world of real-time decision-making.

Shift in what we analyse: Enterprises need a broader, systems-based approach to the information they examine and optimise. Stream computing and event processing capabilities are enabling the analysis of massive volumes.

Event Processing

Event processing provides us with an approach to leverage this wealth of information for competitive advantage and for the betterment of people.

An Event: Any signal indicating that a change of state (of the business, real-world or other domain) has occurred or is contemplated as having occurred.

Simple Event Processing (SEP): Simple event processing is not a new concept. SEP provides functions to detect and respond reactively to a single source, or homogeneous event type.

Complex Event Processing (CEP): Detects and responds reactively to patterns among like or related events, missing events and aggregate events. CEP supports high volumes of homogeneous events within a predictable time and order.

Business Event Processing (BEP): Extends CEP and provides a graphical, non-programmatic user interface that allows business users to manage event processing logic themselves. BEP supports high volumes of heterogeneous business events and complex patterns that occur in no particular time or order.
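To give a feel for the CEP idea (this is a toy sketch, not any particular engine's API or rule language), the fragment below detects a pattern across related events: three failed logins for the same user inside a one-minute window raise a single "complex" event.

```java
// Toy illustration of the CEP idea (not any particular product's API): detect a pattern
// across related events -- three failed logins for the same user inside a one-minute
// window -- and react to the aggregate rather than to any single event.
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ComplexEventSketch {

    record Event(String type, String subject, long timestampMillis) { }

    static final long WINDOW_MS = 60_000;
    static final int THRESHOLD = 3;

    static void process(List<Event> stream) {
        Map<String, Deque<Long>> failuresBySubject = new HashMap<>();
        for (Event e : stream) {
            if (!"LOGIN_FAILED".equals(e.type())) continue;   // correlate only the related events
            Deque<Long> window = failuresBySubject.computeIfAbsent(e.subject(), k -> new ArrayDeque<>());
            window.addLast(e.timestampMillis());
            // Drop events that have fallen outside the time window.
            while (e.timestampMillis() - window.peekFirst() > WINDOW_MS) window.removeFirst();
            if (window.size() >= THRESHOLD) {
                // The "complex" event: a pattern over several simple events.
                System.out.println("Possible intrusion attempt for " + e.subject());
                window.clear();
            }
        }
    }

    public static void main(String[] args) {
        process(List.of(
                new Event("LOGIN_FAILED", "alice", 0),
                new Event("LOGIN_FAILED", "alice", 20_000),
                new Event("LOGIN_OK", "bob", 25_000),
                new Event("LOGIN_FAILED", "alice", 40_000)));   // third failure inside the window
    }
}
```

BEP then layers a business-friendly way of defining rules like this on top, so the threshold and window are not buried in code.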

Event processing is complementary to SOA

SOA is an approach to modelling your business, and hence IT, as a set of reusable activities or services. Services are then orchestrated or composed into coarser services, processes or applications. SOA communication is typically (or is perhaps narrowly understood to be) implemented as a request/response pattern.

Event processing adds an asynchronous pattern to SOA components and services. These patterns are complementary. Services can emit events and consume events. The event processing itself is performed as an extended capability of an ESB.
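A stripped-down sketch of that asynchronous pattern, with a tiny in-memory bus standing in for an ESB topic (all names here are illustrative assumptions): the service emits an event as part of doing its normal work, and subscribers consume it without the service waiting for them.

```java
// Sketch of the asynchronous, event-driven complement to request/response SOA.
// The in-memory "bus" below is a stand-in for an ESB topic; all names are illustrative.
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.function.Consumer;

public class EventEmittingServiceSketch {

    record OrderPlaced(String orderId) { }

    static class SimpleBus {
        private final List<Consumer<OrderPlaced>> subscribers = new CopyOnWriteArrayList<>();
        private final ExecutorService executor = Executors.newSingleThreadExecutor();

        void subscribe(Consumer<OrderPlaced> subscriber) { subscribers.add(subscriber); }

        void publish(OrderPlaced event) {
            // Asynchronous delivery: the emitting service does not wait for its consumers.
            executor.submit(() -> subscribers.forEach(s -> s.accept(event)));
        }

        void shutdown() { executor.shutdown(); }
    }

    static class OrderService {
        private final SimpleBus bus;
        OrderService(SimpleBus bus) { this.bus = bus; }

        void placeOrder(String orderId) {
            // ...the usual request/response work happens here...
            bus.publish(new OrderPlaced(orderId));   // ...then the service emits an event
        }
    }

    public static void main(String[] args) throws InterruptedException {
        SimpleBus bus = new SimpleBus();
        bus.subscribe(e -> System.out.println("Fulfilment notified of " + e.orderId()));
        bus.subscribe(e -> System.out.println("Analytics recorded " + e.orderId()));
        new OrderService(bus).placeOrder("ORD-100");
        Thread.sleep(200);   // give the asynchronous delivery a moment to run
        bus.shutdown();
    }
}
```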

Conclusion

Event processing, and especially business event processing, begins to deliver on the "Utopian ideals" of real-time analytics and business-configured processing.

This is big ... and it's going to get a lot bigger.

Monday 1 March 2010

Mind the Gap

Change

Businesses are experiencing rapid disruptive change, driven by such shifts as internet-enabled businesses; collapsing time; globalisation of markets and business configurations; social and economic imperatives; financial uncertainty, and changes in industry boundaries and structure.

This is leading to new business styles such as on-demand, value ecosystems, collaborative networks, and the globally integrated enterprise; which in turn place very different demands on operations and IT.

As Wytenburg states, "the greater the degree of complexity in an environment, the more various, dynamic, and unpredictable are those situations". We have moved away from a known single future, away even from a range of possible futures, to a truly ambiguous future. There are too many permutations of possible architectures for such an uncertain future.

Significance

As with all uncertainty we need to focus on the "architecturally significant" elements - in this situation it refers to those elements of an organisation that will enable success in a future where the pace of change is accelerating.

Osterwalder's Business Model Ontology provides a good solution context for considering what is significant. I would add an additional component though that I will call "The Core".

The Core

The Core extends the ontology proposed by Osterwalder by adding an additional Core element associated with Core Capabilities. The Core refers to that differentiating aspect of an organisation which transcends products and services, and what is traditionally defined as "core business". It is not an idealised construct but the tangible essence of an organisation that includes the key people, their vision and values.

In this context the Core combines (a) Vision with (b) People. Initially the people are those that formulated the vision for the organisation. These people lead the organisation and are the solid aspect thereof. The core (of the organisation) can, and probably should, be extended during the life of an organisation.

Key Capabilities

Within this construct, the key capabilities of the Core are those that position an organisation architecturally for an uncertain future:
  • Manage core values and value streams: At the heart of what allows an organisation to adapt while maintaining some consistency in the eyes of its key stakeholders (customers, partners, investors, employees). They are internal beliefs which guide the decision-making and behaviour of the organisation.
  • Shed and Attract: Agility is a key attribute of future business which has resulted in a marked shift from a 'loyalty/purpose' based business context to an 'opportunity' based context. As this trend continues, businesses who have a core competence to shed and attract will be able to maintain their competitive position over those who are less able to do so.
  • Communicate: There is a requirement for richer communication with all stakeholders, from employees to customers. The ability to establish trust through one's actions and communicate that and a vision for the future will differentiate the winners from the losers.
  • Sense and Respond: This is the ability to detect change and respond rapidly to it. Included in this capability is the ability to manage "future uncertain" through scenario planning and management.
  • Manage Information: Managing master data and information is critical to the decision-process and is a core ability, without which organisations will flounder.

The Gap

It is important for organisations to "mind the gap" when they shed components of their businesses that are better delivered by a supplier or through the network. This gap exists beyond the formal contracts between components and is often covered by the good nature, professionalism or natural instincts of people within a business.

As business components become increasingly logically and physically separated, it will become increasingly important to understand whether the sum of the contracts equals the whole, and whether the whole still delivers what people are accustomed to. Managing that difference will be crucial.

ITWeb: Sci-fi meets society

And the second article by Lezette (ITWeb):

Sci-fi meets society

As artificially intelligent systems and machines progress, their interaction with society has raised issues of ethics and responsibility.

While advances in genetic engineering, nanotechnology and robotics have brought improvements in fields from construction to healthcare, industry players have warned of the future implications of increasingly “intelligent” machines.

Professor Tshilidzi Marwala, executive dean of the Faculty of Engineering and the Built Environment, at the University of Johannesburg, says ethics have to be considered in developing machine intelligence. “When you have autonomous machines that can evolve independent of their creators, who is responsible for their actions?”

In February last year, the Association for the Advancement of Artificial Intelligence (AAAI) held a series of discussions under the theme “long-term AI futures”, and reflected on the societal aspects of increased machine intelligence.

The AAAI is yet to issue a final report, but in an interim release, a subgroup highlighted the ethical and legal complexities involved if autonomous or semi-autonomous systems were one day charged with making high-level decisions, such as in medical therapy or the targeting of weapons.

The group also noted the potential psychological issues accompanying people's interaction with robotic systems that increasingly look and act like humans.

Just six months after the AAAI meeting, scientists at the Laboratory of Intelligent Systems, in the Ecole Polytechnique Fédérale of Lausanne, Switzerland, conducted an experiment in which robots learned to “lie” to each other, in an attempt to hoard a valuable resource.

The robots were programmed to seek out a beneficial resource and avoid a harmful one, and alert one another via light signals once they had found the good item. But they soon “evolved” to keep their lights off when they found the good resource – in direct contradiction of their original instruction.

According to AI researcher Dion Forster, the problem, as suggested by Ray Kurzweil, is that when people design self-aggregating machines, such systems could produce stronger, more intricate and effective machines.

“When this is linked to evolution, humans may no longer be the strongest and most sentient beings. For example, we already know machines are generally better at mathematics than humans are, so we have evolved to rely on machines to do complex calculation for us.

“What will happen when other functions of human activity, such as knowledge or wisdom, are superseded in the same manner?”

Sum of parts

According to Steve Kroon, computer science lecturer at Stellenbosch University, if people ever develop sentient robots, or other non-sentient robots do, we'll need to decide what rights they need. “And the lines will be blurred with electronic implants: what are your rights if you were almost killed in an accident, but have been given a second chance with a mechanical leg? A heart? A brain? When do you stop being human and become a robot?”

Healthcare is one area where “intelligent” machines have come to be used extensively, involved in everything from surgery to recovery therapy. Robotic prosthetics aid people's physical functioning, enabling amputees to regain a semblance of their former mobility. The i-Limb bionic hand, for example, uses muscle signals in the remaining portion of the limb to control individual prosthetic fingers.

More “behavioural” forms of robotic medical assistance, such as home care robots, are also emerging. Gecko Systems' CareBot acts as a companion to the frail or elderly, “speaking” to them, reminding them to take medication, alerting patients about unexpected visitors, and responding to calls for help, by notifying designated caregivers.

According to Forster, daily interaction with forms of “intelligent” machines is nothing new. “We are already intertwined with complex technologies (bank cards, cellphones, computers), and all of these simple things are connected to intelligent machines designed to make our lives easier and more productive.

“The question is not 'should' we do this, we already do, but how far should we go?” He adds this question is most frequently asked when one crosses the line of giving over control to a machine or technology that could cause harm.

While the advent of certain technologies, such as pacemakers, or artificial heart valves, or steel pins inserted to support limbs are generally beneficial, explains Forster, their progression could bring complexities.

“These are all technologies that make life better, and are designed to respond to environmental changes in order to aid the person in question. But, if my legs, arms, eyes, ears and memory are replaced by technologies, the question is when do I cross the line and stop being human and become a machine?”

Sun Microsystems founder William Joy is one of the industry's more outspoken critics of people's increasing dependence on technology, and warns in a 2000 Wired article: “The 21st-century technologies – genetics, nanotechnology, and robotics – are so powerful that they can spawn whole new classes of accidents and abuses.

“Most dangerously, for the first time, these accidents and abuses are widely within the reach of individuals or small groups. They will not require large facilities or rare raw materials. Knowledge alone will enable the use of them.”

Drones to decision-makers

Another area of contention regarding the increasing involvement of “intelligent” machines is in military applications. A recent US mandate requires that a third of its military forces be unmanned – remote controlled or autonomous – in future. While this could significantly reduce the number of human casualties in fighting, it also raises fears around autonomous machines' power to drop bombs and launch missiles.

Some argue that giving robots the ability to use weapons without human intervention is a dangerous move, while others say they will behave more ethically than their human counterparts. While these realities are still a way off, they have raised concerns over what is considered ethical behaviour.

The AAAI writes in a 2007 issue of AI Magazine that there's a considerable difference between a machine making ethical decisions by itself, and merely gathering the information needed to make the decision, and incorporating it into its general behaviour. “Having all the information and facility in the world won't, by itself, generate ethical behaviour in a machine.”

Clifford Foster, CTO of IBM SA, says ethics is something that has to be tackled across industry, with various people, including the public, collaborating to make sure policies are in place. “You can't abdicate responsibility to machines. There are certain cases that present opportunities for technology to assist, such as telemedicine, but then it may be necessary to limit this to certain categories, with final decisions resting with professionals.”

In addition, as machine capabilities increase from automated mechanical tasks to more high-level, skilled ones, it calls into question their competition with humans in the workforce. “I think we've been staring one of the simplest ethical issues in the face for a few centuries already, and we still haven't reached consensus on it,” says Kroon.

“How do we balance the need of unskilled people to be employed and earn a reasonable living with the benefits of cheaper industrialisation and automation?” He adds the issue will only get more glaring in the next few decades, as the skill level needed to contribute meaningfully beyond what automated systems can do increases.

“Things like search engines have already radically changed how children learn in developed countries. If we simply dumb down education, that would be a pity,” notes Kroon. “We need a generation of people who can utilise the new capabilities of tomorrow's machines, rather than a generation of people who can contribute nothing meaningful to society, since any skills they possess have been usurped by machines.”

ITWeb: AI comes of age

First of two articles by Lezette (ITWeb) on Artificial Intelligence that I contributed to:

AI comes of age

The focus of artificial intelligence (AI) research has undergone a shift – from trying to simulate human thinking, to specific “intelligent” functions, like data mining and statistical learning theory.

Steve Kroon, computer science lecturer at the University of Stellenbosch, says, in the past, people were enthusiastic about machines that could think like people. “Now, many researchers figure the challenges of the present day are things we need 'alternative intelligence' for – skills that humans can make use of, but don't have themselves.”

The best examples of these “alternative intelligence” fields, notes Kroon, are data mining and machine learning – using advanced statistical analysis to find patterns in the vast amounts of data we're confronted with today.

“That's not to say research isn't being done on human-like AI, but even tasks like speech recognition and computer vision are more and more being seen as tasks that will yield to statistical analyses.”

In the 1950s, researchers began exploring the idea of artificially intelligent systems, with mathematician Alan Turing establishing some of the characteristics of intelligent machines. Consequently, work on AI spanned a wide range of fields, but soon developed an emphasis on programming computers.

“In the past, there was a lot of research on rule-based systems and expert systems,” notes Kroon. “But now, we're faced with areas where there's so much data, that even the experts are at a loss to explain.”

The triumph of these methods, according to Kroon, is that they're discovering things the experts aren't aware of. “Bio-informatics is a great example of this; analysing the data leads to hypotheses, which the biochemists and biologists can attempt to verify, so this new knowledge can help in the development of new medication.

“We're seeing the shift from computation as simply a tool for use by the researcher to validate his hypotheses, to computation being used to generate sensible hypotheses for investigation – hypotheses humans would probably never have found by manually looking at the data.”

Information generation

Artificial intelligence has become so ingrained in people's daily lives that it has become ordinary, says professor Tshilidzi Marwala, executive dean of the Faculty of Engineering and the Built Environment, at the University of Johannesburg.

“Fingerprint recognition is now a common technology. Intelligent word processing systems that guess words about to be typed are now common. Face recognition software is used by security agencies – the situation has shifted to more advanced and realistic applications,” he states.

“We're already seeing automated systems for so many things in our daily lives,” notes Kroon. He adds that people are reading and remembering facts less than they did a generation ago, relying instead on being able to look up information on the Internet, whenever they need it.

“Combine that kind of technology with things like recommender systems and location-aware tools, and soon you'll have a constant stream of information relevant to you, available for your consumption as you need it.”

A project exemplifying this trend is the Massachusetts Institute of Technology's (MIT's) SixthSense prototype, a gestural interface that projects digital information onto physical surfaces, and lets users interact with it via hand gestures.

“When I look around and see how many people are now using mobile smartphones instead of the desktop computers of a couple of years back, and this SixthSense technology they've been prototyping at MIT, I get the impression that 'augmented intelligence' is going to be a big thing in coming years,” says Kroon.

Everyday AI

Web search technologies are widely seen as an application of AI, he adds, with Wolfram Alpha being a prominent example. The "knowledge engine" answers user queries directly by computing information from a core database, instead of searching the Web and returning links.

“Its premise was that people want answers to questions, not just a list of links. And I think they're right, but there's a long way to go before this is powerful enough to dethrone the classical search engine approach,” states Kroon.

“Understanding the question being asked, and trying to infer context for that question, are difficult challenges in AI before one can even start to construct an answer to the question,” he adds.

Another development in this direction is the new “social search engine” Aardvark, recently acquired by Google. “Aardvark uses machine-learning techniques to understand social networks, and then provides answers to a user's query by passing it on to people that its system believes are the best to answer the query,” explains Kroon.

“So, in this model, Aardvark's 'AI' is simply responsible for teaming up a person with a question and someone who can give that person a good answer. This sort of system works well when you're looking for more personalised responses, like hotel and restaurant recommendations, as opposed to the impersonal information typically served up by a regular search engine.”

Mind over matter

Clifford Foster, CTO at IBM SA, says AI offers significant ways of handling the explosion of data in the world. This follows from the use of computers to simulate intelligent processes, and understanding information in context, notes Foster. “This can be applied in a number of areas, such as a recent system to predict and understand the impact of anti-retrovirals on HIV patients.”

The vast amount of information being generated, and the need to process that information in near real-time, to prevent problems from happening, is simply too much for humans to compute fast enough, explains Foster. “So, if you give machines the ability to analyse and respond to this data, it fundamentally changes the way we manage and use information.”

He points to applications, such as orchestrating traffic lights according to traffic flow at various times of the day, or medical diagnoses for people in remote areas.

“One of the biggest healthcare challenges in Africa is limited access to medical professionals. But if a person could present their problem to a computer capable of understanding the symptoms, it could search medical data banks for related content, ask additional questions for greater accuracy, and provide an informed diagnosis, which could then be passed on to a professional.”

Foster believes investment will continue in areas where AI improves people's lives. “It's become less about trying to replicate the brain and more about complementing the way we connect with the world.”

For example, trying to clone the kind of processing involved in catching a ball, with the need to calculate the trajectory of the ball, activating the correct muscles to catch it, and absorbing the impact of the catch, requires a phenomenal amount of computing power, notes Foster, but it's not very useful.

“For a long time, people were confined by the idea that AI must simulate the human brain, but where's the value in that? A programme that can aid people in solving problems, whether it be running their data centre, or managing traffic flow, or reducing the mortality rate, is much more valuable.”

Marwala argues that intelligent machines will always be created to perform a particular or handful of functions. “An intelligent machine that performs many tasks is as elusive a concept as 'the theory of everything', but the adaptation and evolution of a machine performing a specific task is perfectly possible.”

Foster believes the intersection of technology, business, and people is where AI research is going, and where it can have most impact. “Using intelligent systems to get things quicker to assist in areas such as healthcare and education could have a profound impact on society, and change people's lives in ways they never thought of.”

Thursday 4 February 2010

ITWeb: Eco jam sparks green thinking

Thanks to Lezette from ITWeb for this great article on IBM's Eco Jam.

Eco jam sparks green thinking

In a world facing unprecedented demands on scarce resources, consumers and businesses will have to revolutionise the way they use, monitor, and manage information.

This was one of the key messages emerging from IBM's Eco Efficiency Web jam last week, which saw thousands of business leaders and industry experts from around the world exchanging ideas around efficiency and sustainability.

Clifford Foster, IBM's CTO for sub-Saharan Africa, says the eco jam fits in with the group's Smart Planet agenda, which involves identifying scarce resources and finding a way to better manage them. “Smart Planet is about making better use of the resources we have available, for the benefit of the environment, people, and organisations' competitiveness.”

According to Foster, the jamming concept allows for the gathering of as many viewpoints as possible, so contributors can pool their knowledge through themed discussions, in a drive to encourage collaboration in tackling the world's eco challenges.

Foster says the main goal is to stimulate discussions between people who have never connected before.

“Think of it as forums or wikis on steroids, with people sharing ideas to generate new lines of thought.”

According to Foster, a total of 925 client companies from 45 countries participated in the Web-based brainstorming event, with more than 2 000 posts on six key themes, including renewable energy, sustainable technologies, regulation, and green infrastructure.

“While we're seeing an unprecedented demand on water and energy supply, at the same time, there's an interesting trend in technology, which puts us in a better position to manage this demand,” notes Foster.

“One is instrumentation – by putting sensors on everything, we can detect, for example, water leakages, or contamination, or where water is being used in factories and houses, in real-time. If we connect this information to services such as disaster management and healthcare, we can make intelligent decisions, preventing problems before they happen.”

Connected intelligence

By connecting various streams of information together, says Foster, organisations and consumers can move towards predicting events, making smarter decisions, and ultimately doing things differently. Working more efficiently will also see a reduced impact on the environment.

“If we want to tackle the mega problems, such as non-renewable energies and the Smart Planet agenda, these solutions touch on so many organisations and interested parties that you need everybody to want to play,” explains Foster. He uses smart shipping as an example, with the numerous groups involved – from the manufacturer of the goods, to the customer, to the freight and shipping companies, to government from a revenue perspective.

“To solve the big problems, we need to look at things from an info-sharing perspective. In many cases a lot of the information needed to make more proactive decisions resides in a metaphorical island. Until you get companies to share information and look for trends and complementary solutions, we're not going to be able to mine data for the insight needed to do things differently,” he argues.

Foster cites a case in China, where IBM implemented a smart metering system at Dong Energy, consisting of sensors on the entire energy network to detect outages in real-time. “With the way the energy network is configured, you can then re-route energy around the faulty area, and substantially reduce outages in networks,” he explains.

“Energy outages impact everybody – they impact production, an organisation's productivity, and people's lives, and cost money.” Foster adds that smart metering in houses and businesses is an effective way of reducing this impact, as it costs a lot more to fix a problem than to manage it.

“From a consumer perspective, imagine a house where you can see the power usage on a single dashboard, which tells you what the energy consumption is at different times of the day, and which appliances are actually consuming energy. Then you can make more intelligent decisions about when you switch appliances on or off. It's pretty important to people to have a better understanding of how to better use available energy.”

Changing focus

Foster says the discussion he was driving during the jam was how to change the perception of the green agenda from one that just focuses on the environment, to something that makes good business sense.

“If you use less energy, your goods cost less, which means higher profitability. At the same time, using less energy not only costs less, but allows you to store energy for off-peak times, which makes more energy available to citizens at a decent price. So there's a positive impact for people and business.”

Going forward, Foster anticipates a holistic approach to resource management. “I foresee sustainability being baked into the metrics and executive agenda at all organisations, as a way to measure themselves, and the people running the business unit. This will happen more and more as government starts putting in regulations.”

From a technology side, Foster says effective monitoring of information will have a profound effect on the amount of energy houses and businesses save. “Every single appliance and device is going to be smart and aware enough to measure its own energy consumption, and communicate that data so it can start up or shut down when required.”

He adds that SA is well situated to take advantage of this development. “Prepaid energy meters are a well-established energy device in the country, and the next generation of these will not just be about switching off when you run out of credit, but actually being able to decide whether you switch off at certain times of the day, or switch off automatically.

“If you take that technology into businesses, we'll see a profound reduction in energy demand, apart from the actual devices using less energy.”

He adds that the recession has served as an important catalyst for change. “In a period of excess, everything is cheap, and people don't really think about how they use resources. But the age of excess is over. The facts are that the demand on the resources we have is outstripping supply, and we have to use them more efficiently.”

From an African perspective, says Foster, there are a different set of challenges focusing attention on resources. “Eskom does not have enough energy to meet demand, and it's forcing us to think about things differently. Innovations are born in areas where we have these kinds of challenges, and Africa is a breeding ground for these kinds of innovations.”

Wednesday 27 January 2010

2010 Predictions

ITWeb's Biz Beat Newsletter carried this article on my predictions for 2010 and beyond:

Today, a new instrumented, intelligent, and interconnected world, a smarter planet, is emerging that offers great promise for addressing some of our most pressing problems. It also creates an entirely new set of business and technology issues that will need to be considered.

Organisations need to rethink their security strategies and implement fine-grained control of each resource they want to protect. In addition, organisations will have to implement a multi-tiered containment approach to prevent one breach from disrupting the entire organisation. Businesses need to implement far-field detection strategies and tools to detect and prevent problems before they occur.

Cloud computing will increasingly be part of an innovative computing approach that will create and meet the needs of a more dynamic infrastructure, allowing organisations to respond to the opportunities and challenges of a smarter planet.

The world's transition to a services economy has been in the making for the past three decades. A standardised IT factory or services factory is emerging to enable consistent high-quality delivery of services. These factories will enable an adaptable, intuitive interface to the service consumer, where a positive experience is engineered and delivered, along with the planning and anticipation that goes into excellent exception handling.

Backstage, consistent delivery is enabled through standardisation, automation, and learning systems for continuous improvement.

The next generation of data analytics is emerging, which will enable organisations to analyse information in near real-time. It is increasingly possible not only to help a decision-maker identify possible actions to choose from, but also to evaluate those options. After the decision is made and acted upon, its outcome must be monitored and the accuracy of the predictive models evaluated.

Applications that respond to the opportunities of this new world exhibit requirements that are much broader than those found in traditional applications. Hybrid systems architecture is emerging that combines general purpose functionality with specialisation, to drastically improve the system characteristics needed to support these new applications, at comparable cost.