by Rakesh Raman
January 2008. When Nasscom president Kiran Karnik passes on the baton to Som Mittal, Indian software mavens will have more than just a few things to think about. Inwardly, Indian software companies are well aware that the hype around them has begun to disguise the reality of their performance and future. Far from being a “software superpower”, as the spinmeisters spun the image, India barely gets a 5% sliver of the global IT services market estimated at $700 billion.
With Mittal at Nasscom, will hype get even further ahead of reality? Challenges abound. Today, our software makers have their eyes set on foreign lands, ignoring a domestic market that could give them a grounding in the very solutions other emerging economies will soon be clamouring for. By market estimates, Indian software companies earned just $7 billion in FY07 from local users, about a fifth of what they earned from foreign buyers, though performances at home and abroad are equally lackadaisical. So, can a “software superpower” afford to be powerless at home?
Even the Switch firms (moniker for six major software companies: Satyam, Wipro, Infosys, TCS, Cognizant, and HCL) have yet to prove their mettle, as these six together don’t get even 2% of the global services revenues. Individually, none of them has a share close to even 1% of the world market.
The trouble with most Indian software players is that they have failed to keep pace with the times, and continue to follow archaic business models, behaving like bodyshoppers. In technologically advanced markets, they are not valued as highly as they claim. For example, local software exporters continue to serve on risk-laden fixed-price contracts as opposed to long-term and lucrative time-and-material business models. The reason: foreign software buyers treat Indian professionals as mere code writers, not as intelligent analysts who can handle end-to-end consulting projects that integrate tech solutions with users’ core business goals. So skills, or the lack of them, have been a major worry. Though Indian tech education institutes turn out about 200,000 people every year, most of them are just not employable for high-end projects. Informal training schools, meanwhile, are mostly in the business of capitalising on the hype by fleecing gullible job seekers.
And that’s not all. Software businesses also lack market understanding, and so fail to move up the value chain. Even the big software companies prefer to work as day-to-day traders with myopic vision, relying mostly on low-end services. Thus, they’ve failed to make any serious attempt to produce globally acceptable software products, a high-margin business.
The challenge is not just on product development. Software makers need to adopt contemporary distribution modes as well. Pay-per-use models like Software-as-a-Service (or SaaS) are getting more popular because they save capital expenditure for corporate buyers who want to pay only for what they use, just as they’d pay for water or electricity services. Leading companies like Microsoft (with Office Live and Windows Live), Salesforce.com (with on-demand CRM software), and Google (with Google Apps) are already aggressive in the variable fee-based software sales arena.
Piracy is another irritant that deters companies from the products business. Of every 10 software packages in use here, at least seven are pirated. Nasscom has been asking users to pay for software products, but its voice has gone unheeded. Mittal’s job won’t be easy. As Nasscom spreads into adjacent businesses such as business process outsourcing (BPO), it must confront new issues, ranging from the lack of human care at BPO centres to information vulnerability.
So Mittal will have lots to do once he takes charge. He can, of course, choose to become a mere spectator, letting things slide the way they have been. Alternatively, he can emerge as a true torchbearer for propelling the snoozing Indian software juggernaut into the modern business world that is full of opportunities, and put an end to the self-perpetuating schmooze session that Nasscom has been spearheading. Will he?
This article first appeared in The Financial Express, at http://www.financialexpress.com/news/Window-dressing/252500/0
While environmentalists worry about the pollution spread by electronic equipment, major initiatives are being taken on a global scale to check the menace
by Rakesh Raman
And you thought only vehicles on our roads throw dirty particles into the air. Even computers are notorious sources of pollution. Estimates suggest that, the world over, tech equipment annually emits greenhouse gases equivalent to what a couple of million vehicles would. If that’s not bad enough, computers waste half the energy they guzzle.
Thankfully, India is not a major contributor to this looming environmental doom because computer usage here is still so low. The local market for PCs is just around 5 million units a year—a mere 2% of global shipments. And the total installed base of computers in India is a piffling 15 million. Still, the country can’t afford to be just a spectator to green consciousness in the tech world. It needs to be a participant in any movement that pushes the electronic frontier forward. The positive fallout: tech users can save big bucks on energy costs as well.
On this front, the new Energy Star 4.0 specifications—effective from July 20, 2007, and suggested by the US Environmental Protection Agency (EPA)—promise significant energy efficiency for computer systems. Products that meet the new standards would be able to run on nearly half the electricity that conventional systems consume. The Energy Star initiative was conceived in the early 1990s to control energy consumption and greenhouse gas emission by power plants, and is now being extended to cover IT gear. While the older version, 3.0, requires computers to consume no more than 15 watts when idle (in the so-called “sleep mode”), the new standards allow them to use only 4 watts. As the compliant systems would use energy-saving power supply units, they are expected to be over 60% more efficient. Thus, these systems would not only help user companies (and individuals) cut operating expenses, but also do their bit to check global warming.
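As a rough illustration of the arithmetic involved, the idle-power gap between the two specifications translates into per-machine savings as sketched below; the electricity tariff and idle hours are illustrative assumptions, not figures from the article:

```python
# Rough sketch of the idle-power savings implied by Energy Star 4.0.
# The tariff and idle hours below are assumptions for illustration.

OLD_IDLE_WATTS = 15   # Energy Star 3.0 idle-mode ceiling (from the article)
NEW_IDLE_WATTS = 4    # Energy Star 4.0 idle-mode ceiling (from the article)

IDLE_HOURS_PER_DAY = 16   # assumption: machine left on but idle most of the day
TARIFF_PER_KWH = 5.0      # assumption: Rs 5 per unit of electricity

def annual_idle_cost(watts, hours_per_day=IDLE_HOURS_PER_DAY,
                     tariff=TARIFF_PER_KWH):
    """Yearly electricity cost of keeping one machine idle."""
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * tariff

saving_per_pc = annual_idle_cost(OLD_IDLE_WATTS) - annual_idle_cost(NEW_IDLE_WATTS)
print(f"Annual idle-power saving per PC: Rs {saving_per_pc:.0f}")
```

Multiplied across a datacentre or an installed base of millions of PCs, even this modest per-machine figure adds up quickly.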
The imperative of energy saving and environment protection is so acute that several tech firms have joined hands to promote the use of energy-efficient computers and power management tools. Intel and Google, for instance, have roped in Dell, HP, IBM, Microsoft and others—including even the EPA—to start the Climate Savers Computing Initiative. With a 90% efficiency target for computer power supplies, the programme aims to reduce greenhouse gas emissions by 54 million tonnes per year and save over $5.5 billion in energy costs. Intel believes that by 2010, the joint initiative will help cut greenhouse gas emissions enormously—it will be like taking nearly 11 million cars off the road.
That is the need of the hour. Today, most businesses in emerging economies like India are facing a cost-linked energy crisis in running their corporate datacentres (big central locations that house the main IT resources such as hardware, software and networking equipment). Thanks to high energy costs, some say it is cheaper for companies to buy a high-end server than to run it for, say, a year. Now, such companies can expect enormous cost savings by deploying Energy Star-labelled machines. Power bills will be lower not only because of the machines themselves but also because of lower air-conditioning requirements, as energy-efficient machines dissipate much less heat.
Though the benefits of the campaign led by Energy Star 4.0 will accumulate only slowly for Indian enterprises, which may not be ready to replace their systems right away, new tech users can consider this option seriously. Small and medium enterprises in the process of acquiring technology, for example, can adopt energy-efficient computers. They would probably need to make marginally higher capital investments, but it would be worth it. The government can also encourage businesses to buy energy-efficient computers by offering some additional tax benefits on such purchases. Just remember: energy saved is cost saved. And that is the whole point.
This article first appeared in The Financial Express, at http://www.financialexpress.com/news/Green-solutions-for-epollution/209094/
Beware of total cost of ownership claims made by technology marketers
by Rakesh Raman
If a chief information officer (CIO) suddenly jumps out of a window, look outside for the lure – in all probability, you will find a tech marketer waving a reduced-cost offer. Such impetuous behaviour, to those acquainted with the field, is not odd at all. To many, it is even ‘strategic’. After all, cost-saving is what CIOs have been justifying their jobs on, and this is precisely what vendors want to exploit. As with all such phenomena, there exists a reigning buzz-phrase that can be waved for magical effect at meetings and the like: Total Cost of Ownership (TCO). So entrenched has its appeal become that it is referred to as a sales tool. And, indeed, it is. It makes people across the table sit up and listen. Today, IT companies claim lower TCO on almost all their wares, including hardware, software and networking gear.
But is TCO an indicator of real gains, or is it just another sales gimmick to hoodwink the technology buyer? In the context of its usage domain, TCO is a financial estimate that is supposed to help enterprise users evaluate the direct and indirect costs involved over the lifetime of an IT project. The total cost includes the initial investment and post-deployment expenses such as manpower training, administration, power consumption, and so on. Though companies are always keen to use this estimate to calculate their return on IT investments, it’s easier to get nice-looking printouts quantifying it than anything that will prove accurate over the project’s actual lifetime.
TCO analysis finds its genesis in the oil-shocked automobile industry, which strove to serve a market in which the buyer is curious about running costs (such as fuel consumption, maintenance and the like) of a vehicle after buying it. The IT industry borrowed the idea in the 1980s, to help companies select their infrastructure with a close understanding of the operational overheads of computerisation. However, it’s extremely difficult to calculate the TCO for IT equipment because the technology configurations are far more complex than in automobiles. Though a simple, yet crude, formula to calculate TCO is the sum of one-time cost and recurring costs over a period of time, it’s too cumbersome to estimate the recurring or operational costs during the life cycle of a tech product. It’s also difficult to specify the life of a product at the buying stage because technology and business needs change so fast.
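The crude formula mentioned above, one-time cost plus recurring costs over an assumed life, can be sketched in a few lines; every figure here is a hypothetical illustration, not vendor data, and the real difficulty the article describes lies precisely in filling in these numbers credibly:

```python
# Minimal sketch of the crude TCO formula: initial outlay plus
# recurring costs accumulated over an assumed product life.
# All figures below are invented for illustration.

def total_cost_of_ownership(one_time_cost, annual_recurring_costs, years):
    """TCO = initial outlay + sum of yearly operating expenses over the life."""
    return one_time_cost + sum(annual_recurring_costs.values()) * years

# Hypothetical server purchase evaluated over a three-year life.
tco = total_cost_of_ownership(
    one_time_cost=500_000,            # hardware, software and installation
    annual_recurring_costs={
        "administration": 120_000,    # staff and management overheads
        "power_and_cooling": 60_000,  # non-IT expenses count too
        "training": 20_000,           # manpower training
    },
    years=3,
)
print(tco)  # 500000 + (200000 * 3) = 1100000
```

Note how the answer swings entirely on the assumed life and the assumed recurring costs, which is exactly why the estimate rarely survives contact with reality.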
When you buy a server platform, for example, for your corporate datacentre, it’s not only the hardware and associated costs that you consider. Rather, you need to take into account the cost of software that runs on it, its administration, downtime losses, security overheads, and other operational costs. You can’t ignore even the non-IT expenses like floor space, cooling expenses and power charges. And today, as a CIO, when you expect more bang for each investment buck, you need to consider the cost of capital as well.
Further, if your corporate information load increases suddenly, you might have to replace the server with upgraded technology even before its projected lifespan. All this makes TCO calculation a near nightmare for the CIO whose buying preferences are generally based on enterprise information management needs rather than on intricate empirical estimations.
Moreover, in the ever-evolving tech market, conventional TCO is not even relevant in some cases. Companies that use software-as-a-service (SaaS), for instance, pay for usage rather than ownership, so the total cost of use is simple and straightforward; maintenance and administration costs are borne by the software service provider. In the absence of a standard method to calculate TCO, and since user companies can calculate operational costs only after deploying the infrastructure, it makes little sense to use it as a basis for buying decisions. The decision should be made on the basis of economic value or business benefits, not price. If there’s a clear, measurable advantage to some technology, buy it. Banks could go by how well the technology can manage complex transactions, for example, or how adaptable it is to Basel-II norms. Consumer product companies could look for supply chain efficiencies brought about by better information integration. In other words, the focus should not be on minimising cost, but on maximising enterprise benefits to stay on the cutting edge. Cynics, who by Oscar Wilde’s definition know the price of everything and the value of nothing, rarely get there.
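A quick way to see how pay-per-use changes the calculation is to compare an upfront licence plus maintenance against a subscription and find the break-even point; all the figures below are hypothetical:

```python
# Hypothetical comparison of licence-plus-maintenance ownership
# versus a SaaS subscription. All figures are invented assumptions.

def ownership_cost(months, licence=300_000, maintenance_per_month=5_000):
    """Total cost of owning: upfront licence plus ongoing maintenance."""
    return licence + maintenance_per_month * months

def saas_cost(months, fee_per_month=15_000):
    """Total cost of subscribing: usage fee only, no upfront outlay."""
    return fee_per_month * months

# Break-even: licence + m * maintenance = m * fee
# => m = licence / (fee - maintenance)
break_even_months = 300_000 / (15_000 - 5_000)
print(break_even_months)  # 30.0
```

Below the break-even horizon the subscription wins on cash flow; beyond it, ownership may be cheaper, but only if the product actually stays useful that long, which is exactly the uncertainty TCO printouts gloss over.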
This article first appeared in The Financial Express, at http://www.financialexpress.com/old/fe_full_story.php?content_id=1527
Next Generation Networks are no longer in an embryonic state. Are enterprises ready to adopt them?
by Rakesh Raman
The networks of the 1990s were designed primarily to carry data across the enterprise. However, the information exchange requirements of enterprises have evolved over the years. Now they feel a pressing need to share other forms of information such as voice or multimedia along with conventional data on their digital networks. And the Next Generation Networks (NGNs) have emerged to satisfy this need.
A common market perception is that NGNs cater to voice-over-IP (VoIP) applications. That’s true to an extent. However, they offer more comprehensive connectivity options, covering more information types besides conventional data and voice. Driven by Internet Protocol (IP), an NGN offers a packet-based network in which packets are identified by their type: data, voice, or video. These NGN characteristics perfectly fit the bill for enterprises that are eyeing future technologies for their centrally controlled and geographically dispersed networks. “Large enterprises and SMBs are now investing or looking to invest in NGN architecture, particularly IP-based NGNs, that will drive convergence, resulting in lower total cost of ownership and reduced network complexity for them,” says Sudhir Narang, senior vice president, SP and Government, Cisco Systems.
NGN Use: For Business Transformation
Today, most enterprises are striving to have consolidated infrastructure for all their IT needs. This is mainly because they want to simplify infrastructure management at reduced costs. As networks form part of the overall IT infrastructure, CIOs prefer to deploy a common network for their diverse information distribution needs. “Enterprises want all their applications to converge on a single infrastructure and distribute them across geographically distributed locations and even to the mobile workers,” agrees A Prasad Babu, systems engineering manager, Juniper Networks.
However, these applications are no longer restricted to traditional data distribution; corporates also want to introduce voice and video applications on their networks. VoIP, for example, is among the most crucial enterprise requirements today. Using the converged networking offered by NGNs, companies could aim to transform their businesses by offering voice- and video-based value-added services to their customers. While offering Internet banking, for example, a bank can also let its customers talk to bank staff over the same network. “Such a facility will help enterprises attract more new customers and retain the existing ones,” says Prasad Babu.
Similarly, using the video-conferencing facilities offered by NGNs, corporates can achieve more business agility by saving the time and cost they would normally spend on travel. But for all this, communication quality has to be exceptionally good. “As companies can’t afford to compromise on quality of service (QoS) in today’s competitive business environment, NGNs can equip them to deliver the relevant quality for all their communication needs,” says Prasad Babu. In the process, enterprises can also target higher productivity levels. As IP-based NGNs make a service available to end-users across any access network, a service available in the office can be delivered seamlessly over a wireless LAN, a broadband connection, or a cellular network, even to a roaming user. “This service agility drives greater productivity for enterprises,” says Narang.
Further, NGNs can give customers a more comfortable experience by letting them interact with the organization over multiple communication channels. More satisfied customers, in turn, translate into more business for the organization. “Enterprises can integrate new IP data, voice, and video applications over a single IP-based network infrastructure for increased profitability,” suggests Narang.
NGN Deployment: For Competitive Gains
While NGNs promise hefty returns to enterprises in terms of reduced costs and easy manageability, CIOs need to follow the right deployment strategies. For example, while selecting networking platforms, they need to plan for the future, say, up to five years down the road. “With the security and efficient operation of an enterprise network at stake, and with typical service-level agreements (SLAs) lasting for two to five years, selecting the right provider for the job is crucial,” suggests Narang.
For this, the onus will be entirely on CIOs to aptly estimate their information requirements separately for data, voice, and video communications. While conventional selection parameters such as scalability, availability, resilience, and QoS still hold true for NGNs, enterprise tech chiefs need to gain sufficient knowledge about the NGN options before zeroing in on a particular technology. “For this, CIOs can hold frequent discussions with the vendors even before floating their requests for proposals,” suggests Prasad Babu. Depending on their application environment, enterprises can even follow the managed services model. “Managed services or out-tasking to service providers can help businesses reduce overall network costs by 15-25%,” estimates Narang.
To offer an assured experience to end-users, CIOs need to select an application-aware network that can deliver the right QoS. Security will hold paramount importance on converged NGNs, so CIOs will have to secure voice applications, along with data, to prevent unauthorized access to the network. As the enterprise’s basic objective will be to run the new network operations smoothly, service providers will also have to contribute significantly. “Making the difference between 99% and 99.999% availability requires a significant effort by service providers,” says Narang.
There could be more transition challenges. As CIOs will have to handle migration issues while moving from their existing networks to NGNs, they need to carry out a thorough cost-benefit analysis taking into account capex and opex factors, regulatory guidelines, and management requirements, including manpower training. Since NGNs will mean a total revamp of enterprise networks, there will not be much investment protection for user organizations. However, they can achieve their cost-reduction targets by deploying NGN-based converged networks.
NGN Flavors: Technologies and Applications
As NGNs make steady inroads into the enterprise information environment, there are a few technology options that CIOs can consider. Since IP offers tremendous advantages to corporates, IP-based NGNs will clearly be the immediate choice. However, keeping an eye on future demand, enterprises need to closely evaluate technologies based on multiprotocol label switching (MPLS), which primarily uses an IP backbone for information delivery. “MPLS-based NGNs will be highly suitable for enterprises planning to integrate data, voice, and video on a single infrastructure,” says Prasad Babu.
MPLS has clear advantages for managing multiple forms of information. Since MPLS assigns every IP packet a label identifying the user and the information type, routers can forward packets on the label and ignore the IP address. As a result, enterprise users with overlapping IP addresses can communicate on the same provider network, which helps reduce operating costs and simplify connectivity. Also, using labelling techniques that define a label switch path (LSP) and an identifier for the IP address, MPLS offers information security comparable to that of frame relay or ATM networks. MPLS will be particularly useful for NGNs because it helps segregate data, voice, and video transmission. And if a link fails while one form of traffic is in transit, dynamic routing protocols reroute the flow to its destination almost instantly.
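The core idea, forwarding on a label rather than on the IP header, is what lets customers with overlapping private addresses share one provider network. A minimal conceptual sketch (the labels, next hops, and traffic classes below are invented for illustration, not real MPLS configuration):

```python
# Conceptual sketch of MPLS label-based forwarding. A label-switching
# router looks up only the incoming label, so two customers can reuse
# the same private IP address without ambiguity. All labels, next hops,
# and traffic classes here are hypothetical.

# Label forwarding table: incoming label -> (outgoing label, next hop)
label_table = {
    101: (201, "router-B"),   # customer 1, data traffic
    102: (202, "router-B"),   # customer 1, voice traffic
    103: (203, "router-C"),   # customer 2, data traffic
}

def forward(packet):
    """Swap the label and pick the next hop; the IP address is ignored."""
    out_label, next_hop = label_table[packet["label"]]
    return {**packet, "label": out_label, "next_hop": next_hop}

# Two customers using the same private address travel on different paths
# because their packets carry different labels.
p1 = forward({"label": 101, "dst_ip": "10.0.0.5"})
p2 = forward({"label": 103, "dst_ip": "10.0.0.5"})
print(p1["next_hop"], p2["next_hop"])  # router-B router-C
```

The same mechanism is what lets a provider keep data, voice, and video traffic segregated on one backbone: each class simply rides its own label switch path.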
Thus, enterprises can expect exceptional reliability from the NGN infrastructure, which can create the platform for plenty of new-generation applications. “Application convergence opens the doors to all-media services such as videoconferencing, which is effectively a new service, being neither voice, nor video, nor data, but an integration of all three,” says Narang. “Applications such as IP telephony and videoconferencing drive down communication costs.” So NGN applications are particularly relevant for enterprises in the BPO, telecom, and BFSI markets that deal with large numbers of consumers. While NGNs can help enterprises leverage voice and video applications, their success will largely depend on regulatory guidelines.
But all these could be just teething troubles. They shouldn’t discourage enterprise CIOs looking for innovative technologies to strengthen their information infrastructure. It will be hard for them to ignore the benefits promised by NGNs. The right deployment approach will, of course, determine the success of the new platforms.
This article first appeared in the Voice&Data magazine, at http://voicendata.ciol.com/content/NetworkingPlus/106011001.asp