Future of Indian Banking
Dear Arun,
Must the Reserve Bank Pay Interest on CRR Balances?

There has been a passionate demand in some quarters that the Reserve Bank pay interest on the so-called cash reserves maintained with it by banks as part of its monetary policy. The current applicable Cash Reserve Ratio (CRR) is 4% of Net Demand and Time Liabilities (NDTL). But there is no logic or reason to paying interest on CRR, because CRR is about impounding liquidity with a view to reducing M3 (money supply) through adjustment to reserve money (also called primary liquidity, high-powered money or base money), which can universally be created or withdrawn only by a central bank!

Typically, under fractional reserve banking, the required reserve ratio (CRR) sets the theoretical maximum limit on the broader money banks can create by making loans out of their deposits. Thus, if CRR is 4% of NDTL, as now, then the theoretical maximum M3 is 1/0.04 = 25 times the required reserves, i.e. the money multiplier (MM) is 25. In actual practice, however, the public's preference to keep a chunk of their deposits in the form of cash/currency, and banks choosing to accumulate excess reserves, act as a drain on broad money creation, resulting in a much smaller observed money multiplier. Thus, the currently observed money multiplier is about 6, obtained by dividing the current M3 of about Rs 90 trillion (currency worth Rs 12 trillion and deposits worth Rs 78 trillion) by reserve money of about Rs 15 trillion (currency worth Rs 12 trillion and CRR balances worth Rs 3 trillion).

It is not that RBI cannot, or should not, pay interest on CRR as demanded in some quarters. Only that, to have the desired extent of impounding and so influence aggregate demand in the real economy, M3 will still need to be contracted by a certain amount. If interest is paid, the effective contraction of money will be less to the extent of the interest (itself reserve/base money) paid by RBI, and so CRR will have to be correspondingly higher than the present 4% to achieve the required contraction in M3 and hence the required compression in aggregate demand in the real economy!

More generally, if C% is the CRR without interest payment by RBI, i% the interest rate paid by RBI, and C1 the new corresponding CRR accruing interest at i%, then the interest accrued on C1 is (i/100)*C1, so the effective impounding with payment of interest is C1 - (i/100)*C1 = C1(1 - i/100). This must equal the policy-neutral objective C, i.e. C1(1 - i/100) = C, or C1 = C/(1 - i/100). So if i = 6.75% (the current Reverse Repo Rate) and C = 4%, as now, then the required CRR with payment of interest will be C1 = 4/(1 - 0.0675) = 4/0.9325 = 4.29%. In other words, the choice is simple: either have a monetary-impact-neutral CRR of 4% without payment of interest, or have CRR at about 4.30% with payment of 6.75% interest by RBI!
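The two calculations above — the observed money multiplier and the interest-adjusted CRR — can be sketched in a few lines of Python (the rupee figures are those quoted in the text):

```python
# Sketch of the two calculations in the text: the observed money
# multiplier, and the CRR needed to stay impact-neutral when the
# central bank pays interest on CRR balances.

def money_multiplier(m3: float, reserve_money: float) -> float:
    """Observed money multiplier = M3 / reserve money."""
    return m3 / reserve_money

def interest_adjusted_crr(c: float, i: float) -> float:
    """CRR C1 (%) that keeps effective impounding equal to C (%)
    when interest is paid at i% on CRR balances:
    C1 * (1 - i/100) = C  =>  C1 = C / (1 - i/100)."""
    return c / (1 - i / 100)

# Figures from the text: M3 ~ Rs 90 trillion, reserve money ~ Rs 15 trillion.
mm = money_multiplier(90, 15)        # observed MM ~ 6, vs theoretical max 1/0.04 = 25
c1 = interest_adjusted_crr(4, 6.75)  # ~4.29%, as derived in the text
print(f"Observed money multiplier: {mm:.1f}")
print(f"Interest-adjusted CRR: {c1:.2f}%")
```

Run with other inputs (say i = 8%), the same function shows how the required CRR rises further as the interest rate paid on reserves goes up.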
V K Sharma
Happy reading
Executive Editor Banking & Finance Management
IMPRINT
Quotes
EXECUTIVE EDITOR
Mr. V K Sharma, Ex. ED - Reserve Bank of India
EDITOR & PUBLISHER*
Mr. David Chelekat
EDITORIAL BOARD
"What matters finally is not knowing what must be done but actually doing what must be done and doing it when it must be done!" - V. K. Sharma
"Bankers are the largest class of unpunished criminals in the country." - J. Howard Shelov
"Life is tough but it's tougher when you're stupid." - John Wayne
"Income tax returns are the most imaginative fiction being written today." - Herman Wouk
"If there's a 50/50 chance something can go wrong, then nine times out of ten it will." - Paul Harvey
Mr. Sudeep Bhandopadhyay, MD - Destimoney
Mr. Amol Vidwans, Group CIO, Balaji Infrastructure Group
Mr. Sunil Rawlani, Partner, Metamorphosys; Ex CIO, HDFC Life
Mr. Anjan Choudhary, CIO, Universal Commodities Exchange
Mr. Jayantha Prabhu, Group CTO, Essar Group
Ms. Alpna Joshi, CIO, Reliance Comm.
Mr. Indranil Ghosh, DGM - IT - Alt. Channels, ICICI Bank
Mr. Dhiren Savla, CIO, VFS Global
Mr. Thomson Thomas, CIO, HDFC Life Insurance
Mr. Ram Kalyan Medury, Vice President - IT, ICICI Lombard
Mr. Umesh Jain, Sr. Vice President - IT, NSE
Mr. Harish Shetty, Sr. Executive VP - IT, HDFC Bank
Mr. Sameer Ratolikar, Sr. VP and CISO, HDFC Bank
Mr. Milind Tandel, Head - IT Procurement, Barclays Bank
Mr. M K Madhavan, CTO - Indusind Bank
Mr. N Jambunathan, Chief GM - IT, SBI
Mr. Samir Khare, Vice President - IT, Fullerton India
"I'm one of those that believes you can't be one kind of a man and another kind of leader." - Dr. Phil McGraw
"Expect the best. Prepare for the worst. Capitalize on what comes." - Zig Ziglar
"Our business is about technology, yes. But it's also about operations and customer relationships." - Michael Dell
ADVISORY BOARD
Mr. Dewang Neralla
Mr. S S Subramaniam
Mr. Alok Tiwari
Mr. Chetan Dighe
Mr. L
Mr. Vishal Salvi
MANAGEMENT Transact360
MARKETING & SALES Footprint Media
RESEARCH Antz Consulting
ENGAGE i3rGlobal
"Do the hard jobs first. The easy jobs will take care of themselves." - Dale Carnegie
"Are we going to be a services power? The double-cheeseburger-hold-the-mayo kings of the whole world?" - Lee Iacocca
"The way to make money is to buy when blood is running in the streets." - John D. Rockefeller
Pegasus
EDITORIAL Transact Media
DESIGN & LAYOUT T360Design
CIRCULATION 1SourceMedia
REGISTERED ADDRESS 5/4, Assisi Nagar, P L Lokhande Marg, Chembur, Mumbai 400 043
MARKETING & MAILING 7/17 Assisi Nagar, P L Lokhande Marg, Chembur (W), Mumbai 400 043
This magazine is an aggregation of knowledge to help industry make better decisions. The magazine is printed at Ramabhai Press, Vashi, Navi Mumbai, and published by David Chelekat for Transact360. All views are those of the authors and solicited advertisers. Readers are requested to make appropriate inquiries before buying into products and services. Transact Media/Transact360 is not responsible for steps taken by readers based on claims/performance of products as made by any advertiser or writer. Readers are requested to verify all claims independently. *Responsible for news under the PRB Act
BASEL norms Risk
The organization's leadership should take a hands-on approach to risk management and make sure its goals and objectives are communicated from the top down.
The underlying rationale in implementing the Basel Committee norms was to insulate the banks from any sort of adverse shock, a point still in debate.
Why leaders fail The distance between beloved leader and despised failure is shorter than what most people think!
The D’factor
The mantra for a smart organization is not always being lean in size, but lean and more decisive.
Rebirth Verticalization
All businesses are always looking for opportunities to increase efficiency and effectiveness. This is achieved through better practices, and one of the more critical ones is better human capital investment and returns on that investment. After all, your business is only as good as the people who manage it.
Relax
One of the biggest risks in business is the fact that your star performer may burn out. How you avoid this is critical to the success of almost every organization.
The branch of the financial services organization has time and again proved to be a show of strength for the industry, having built habit-forming relations.
Human capital
Companies will eventually select service providers that deliver the best quality and value-add, not just the best price. Increased adoption of standardized solutions in outcomes-based models will drive outsourcing to become more utility-based.
To use a cliché, there will be a tectonic shift in the Indian banking landscape in 2014. Both the body and the soul of the Rs 80 trillion banking industry will change. A set of new banks will get the regulator's approval, some foreign banks operating in India may decide in favour of local incorporation to get near-national treatment, and new norms for early recognition of financial distress and faster resolution and recovery will help curb rising bad assets and improve the health of the banking system.

As an offshoot, Indian firms, especially those operating in the infrastructure space and adding to the growing pile of bad loans at banks, will become more aggressive in selling assets to shrink debt. In 2013, close to a dozen Indian companies either sold assets or made their intention to sell clear, to pare at least Rs 3.5 trillion worth of debt, as rising interest costs and diminishing margins took their toll on growth. This trend will intensify in 2014.

Following the nationalization of 14 large banks in 1969 and six in 1980, RBI has so far given licences to only 12 banks in two phases, including the conversion of a cooperative bank into a commercial bank. In the past, RBI's stated objective behind giving licences to new banks was to introduce competition in the sector, largely dominated by government-owned banks. This time, the prime focus is to promote so-called financial inclusion, or increasing the reach of financial services to the unbanked population. The then finance minister, Pranab Mukherjee, in his February 2010 budget speech, had announced that RBI would open up the sector and issue fresh licences with the objective of spreading banking services wider in a nation where roughly 50% of the adult population does not have access to them. After issuing a discussion paper and receiving public feedback on it, RBI issued the guidelines on new banking licences in February 2013 and set a 1 July deadline for applications.
A panel of four, headed by former RBI governor Bimal Jalan, has been scrutinizing the applications. IMF’s financial access survey of 2011 gives us a fair idea about how critical financial inclusion is in India. In every 1,000km stretch, India has 30.43 bank branches and 25.43 automated teller machines (ATMs). In contrast, China has 1,428.98 branches and 2,975.05 ATMs. Similarly, there are 10.64 bank branches and 8.9 ATMs for every 100,000 of the population in India. The comparable figures for China are 23.81 and 49.56. Finally, bank deposits in India constitute 68.43% of the nation’s gross domestic product (GDP) and
credit 51.75%, against China's 433.96% and 287.89%, respectively. To expand banking services in a nation of 1.2 billion people, one needs deep-pocketed promoters, and this is why corporations have been allowed to apply for banking licences, but not too many of them seem to be interested. According to the licensing norms, a new bank will have to be listed within three years, bringing the promoters' shareholding down to 40%. Within 10 years, this holding must be further pared to 20%, and by the 12th year to 15%. This is a big deterrent, as the promoters will not be able to reap the benefits of the value they create. How many licences will be given is anybody's guess at this point, but one thing is for sure: armed with technology, the new banks will shift the playing field from the cities to rural India and add a new dimension to the rural consumption story. At a parallel level, some old and big foreign lenders may set up wholly owned subsidiaries in India because they will get "near national" treatment from the regulator when it comes to opening branches. Foreign banks with "complex structures" and banks that do not provide "adequate disclosure" in their home jurisdictions, as well as "systemically important" ones, will have to convert their local units into subsidiaries. Systemically important banks are those whose assets account for at least 0.25% of the total assets of all commercial banks. At least 12 foreign banks fall into this category. Those banks that started operations in India before August 2010, however, have the option to continue their business through the branch mode, but they will be "incentivized" to follow the local incorporation route. One of the incentives is allowing foreign banks to buy private sector banks in India. If indeed that happens, banking will never be the same in India. While new banks and locally incorporated foreign banks will rewrite the rules of the game, RBI's initiative to clean up bad loans will add strength to the banking system.
The combination of bad and restructured loans is at least 10% of total banking assets in India. The new norms will give banks incentives to detect the first sign of a loan turning bad and take remedial steps; at the same time, they will make life difficult for rogue borrowers. At the next stage, RBI will probably focus on reforming state-run banks, which account for about 70% of banking assets but lack the skill to manage them and aren't smart enough to say no when it comes to taking exposure to some sectors. Overall, 2014 will be action-packed; banks cannot ask for a more exciting time.
Banking in India is poised to enter a huge growth phase with the expected entry of 25 new bank licence holders. This, however, is expected to leave the industry with a huge shortfall of qualified manpower. So how will the industry meet these requirements and still deliver the customer service needed to retain its customers? If they do not, banks stand to lose not just employees but customers, and with them their top lines and thereby their bottom lines...
Natarajan Chandrasekaran
Chief executive officer and managing director, Tata Consultancy Services

N Chandrasekaran is Chief Executive Officer & Managing Director of Tata Consultancy Services Ltd. He holds a Master's degree in Computer Applications and has been with TCS since 1987, serving in several capacities including head of global sales and chief operating officer. He drives innovation within TCS through the collaboration route, creating apps for use by internal employees in improving their quality of life, thereby improving productivity and the overall health of the organization.

Four of the most senior people in TCS are spying on N Chandrasekaran, the CEO. Each mile he runs, each marathon he participates in and the fitness schedule he keeps. Chandra doesn't mind. He is watching them too. He slides open his iPhone and brings up a nifty-looking application. "We have developed this application called Fit 4 Life," says Chandrasekaran. It allows TCS employees to form groups of 5, 10 or 15 to keep track of their running, swimming, or even brisk walking. The app helps co-ordinate and also puts on a bit of peer pressure to make sure everybody remains in the prime of health. "It also helps people bond," says Chandra. Right now, just a handful of people are using this app, which is in the pilot stage. Eventually, the whole company will get to use it. Fit 4 Life is just one of the 25 apps that TCS is developing. There is an app that will make the drudgery of filling timesheets, an everyday chore for IT employees, a breeze. Then there is another that will allow people with some commonality, either staying in the same locality or having gone to the same college, to discover each other. This app mania inside TCS isn't a fad. "The future is on the smartphone. How can TCS develop the capability to operate in this new world that is based on smartphones, apps and of course young people?" asks Chandra. He has thrown the company into the digital deep end. There will be some who will flail and scream, but most will eventually learn to float and swim. Chandrasekaran is on the next game-changing project, codename: Vivacious. It is designed to make TCS ready for social media, allow easy collaboration across the company, and bridge the gap with youngsters who are very well-versed in the smartphone and app world. Vivacious will also allow TCS employees to feel more connected to each other and lessen the feeling of a mammoth, cold organisation that most large companies exude.
On top of all these, it could convert TCS into a mega pilot project that Chandra can showcase to customers for more business. The promise: develop skills to thrive in the new digital era.

Ask the Question
The Chinese did not say this, but the journey to the big question usually begins with one that is much smaller. "How many people do you know?" asks Chandrasekaran. Well...300? Make that 700. After all, there are three of us in the room fielding that question. "Has it occurred to you that today you come in contact with many people who you don't know or will know for a brief moment, but all those interactions are valuable?" he persists. And that leads us to the question, a slightly bigger one: "How can an organisation use the power of the not-so-familiar or the mildly strange?" That might sound like a paradox. An organisation is, after all, a familial unit. There are no strangers. Now think about TCS. This organisation has 250,000 employees across the country. In Mumbai alone, it has 20 offices. Its corporate HQ in the city has more than 300 people, while 19 other offices have about 1,000 people each. Its largest office, in Siruseri near Chennai, has about 25,000 people. It is all very well to group such a dispersed workforce in "small, empowered units" and get them to execute a plan. A bigger challenge is to get the same organisational units to look for new, emerging trends. Any unit is prone to becoming rigid in its way of thinking, and a successful unit even more so.

What British Airways Did
Chandrasekaran knows that TCS may be at a juncture where once again he has got to get the organisation to "think big" and not just "execute better". He saw British Airways take such a step last year. The airline decided to give its cabin crew iPads. Before they boarded the plane, their tablets were loaded with data related to passenger profiles, journey details, seat plans and so on. The response, as recorded by some media outlets, was good. "I'm ahead of myself in
knowing where our corporate and high-value customers are sitting, and who needs help," Bloomberg quoted one BA flight attendant as saying. "They look at you and say 'have you been on a special course?'" For Chandra, the BA initiative represented a big shift that's happening across industries. "The point is this: by making information available to everyone, you are in effect making everyone an aircraft manager. They can now take decisions because they have information. In effect, you are putting management in the plane," he said. That big shift, of information flowing freely within an organisation, will be driven by the emerging technologies: social, mobile, games and so on. These cannot be built in the traditional way. (The traditional way would be to write a project proposal with 2017 as the deadline, he says.) Instead, he created a group of 100 people and asked them to build mobile apps. Chandra knows that the guys in the company who will intuitively understand this are the youngsters, 70 percent of TCS' employees. He also knows that such technologies, and the nature of application development for these platforms, are inherently different from developing core banking software. If banking software is like making a bespoke suit, apps are more like Zara. They need a freshness of approach and a willingness to change the software on the fly, based on customer feedback, of course. So while TCS cannot become like a startup (it is too big for that), it needs all the youngsters on board. It needs their enthusiasm and their way of looking at problems. "We wanted to enhance our ability to sense and adapt and we wanted to let the liveliness of our GenY workforce loose within the company," says Hasit Kaji, vice-president, TCS, and the point man for the project. Kaji had help from Krish Ashok, head of Web Innovation 2.0, TCS. Among certain groups online, Ashok enjoys the status of a minor genius, for giving a 'Tambrahm' twist to the Rage Guy meme and for funny music compositions.
Learning to Collaborate
Ashok believes organisations have three primary challenges to seamless collaboration across an enterprise. The first is people themselves. "People still don't break out of silos and share. Very often workflows are designed so there is no access to others in an organisation," he says. The second challenge is the gulf between youngsters and the old fogeys. "If you introduce something like a Facebook or a LinkedIn, the younger employees will adopt it and the older employees will see it as a threat. The
older people also want their work and social life to be kept separate," says Ashok. The third challenge is that people are wedded to their email. "Email is an ancient technology. It is linear and not at all conducive to collaboration." So TCS now has an internal Twitter-like platform. Employees can 'follow' anyone inside the company. They post their questions, make suggestions or simply ask for help on the 'wall' of that person. When someone answers a query or responds to a help yelp, he accumulates 'Karma Points'. The system automatically selects people with a certain number of Karma Points to become moderators of posts. This is actually a coming together of separate platforms inside TCS that are about four years old. One of them was Justask, a technology question board; then there was an internal Craigslist-like portal where people trade stuff or exchange information; and Ideamax, an innovation marketplace. This internal social media was quite useful in the years after the financial crisis of 2008. "Very often, you would have guys going on about increments or promotion on these portals. They suddenly had our HR head directly addressing their queries," says Krish. Just the satisfaction of having someone senior take their questions directly made them feel better, if not totally happy. That has evolved, with people now asking for help on something as small as fixing a macro in an application, or seeking help reducing energy consumption in a data centre. Now senior management posts challenges on this platform. The best response to a challenge wins an iPad and such stuff.

That Sync-ing Feeling
Chandra has been happy with the results so far. He already has ideas on how the one app that he uses most frequently, Fit 4 Life, can be made better. "There is a lot more that one can do with this app, but which we haven't done. For instance, can we say there must be gender diversity in the virtual groups? Or one could say there must be at least one member from another centre," Chandra said. The idea is to open this up for all its 250,000 employees. If they have an idea, they convert it into an app, and if an app is good, it gets votes from peers in the form of downloads/usage. Eventually, apps will be the primary way in which the organisation interacts. But the path itself is interesting. In effect, this means tapping the creativity of everyone in the organisation; it means not having to go through the normal circle of planning, budgeting, monitoring; it means bringing an organisation up to the speed of what's already happening in the world of tech. The first version of Facebook was written in a dormitory. It took just 18 months for Instagram to get a billion-dollar valuation. Twitter grew from a brainstorming session at a podcast company: work on the project started in March, and the product was launched by July. 'Project Vivacious' is still in its pilot stage. It is yet to be thrown open. "Right now, it's fuzzy. But it's better to be fuzzy.
If you have clarity, it means you already have an answer," says Chandra. But there is one answer that is not fuzzy at all. The most popular ideas are not necessarily the best. What's been the most popular idea till date? Do away with offices and let everyone work from home! Nobody at TCS thinks it is going to be adopted any time soon.

This article appeared in the Forbes India magazine of 12 October, 2012.
How We Can, and Cannot, Reduce Gold Imports By V K Sharma
Former Executive Director; Reserve Bank of India
Honorable Finance Minister deserves unreserved compliments for explicitly and unequivocally acknowledging in his budget speech the imperative of reducing gold imports, which were the single biggest cause of the unprecedented 30-plus-year-high current account deficit of 4.7% of GDP in FY 2013, with gold imports alone accounting for more than 3%! The main reason for this was that under the then current import regulations (cf. RBI's FEMA Master Circular), there was preferential, more favorable treatment for gold imports as compared to the import of any other item, including essential imports, in that the import of gold was permitted (1) on consignment basis, (2) on unfixed price basis and (3) on metal loan basis. This preferential treatment made gold imports free from both price and currency risks for overseas consignors of gold borrowed overseas, as effectively it amounted to their short selling such borrowed gold in India and simultaneously covering their short position abroad, with both risks being borne by end buyers of jewelry and gold ETFs! This was not the case, though, with other items, say, coal, edible oils etc.,
imported on a direct import basis. I had accordingly, in December 2012, suggested to the then Governor, Dr Subbarao, and to the present Governor, Dr Raghuram Rajan, then Chief Economic Advisor, Ministry of Finance, that a level playing field be created between gold and non-gold imports by aligning gold import regulations under FEMA with those for non-gold but essential imports. And indeed, aligning gold import
regulations with the rest of imports by RBI had the desired effect of taking away this significant, unwarranted and perverse incentive, and delivered the desired outcome of dramatically reducing gold imports in double-quick time, merely by creating a level playing field between the two kinds of imports, so much so that, as its direct consequence, the current account deficit narrowed dramatically to 1.7% of GDP in FY 2014! And this was achieved, as I said, not by imposing curbs and restrictions on gold imports, as was widely made out in a media campaign by some quarters, but merely by creating a level playing field between gold and non-gold imports, by aligning gold import regulations under FEMA 1999 with those for non-gold, but essential, imports like edible oil and coal. But surprisingly, RBI only recently rolled back these entirely wholesome and fair measures and, no wonder, according to media reports, gold imports have again surged and the current account deficit for the latest fiscal quarter has increased to 2.1% of GDP from 1.2%! Against a background as somber as this, to reduce gold imports, Honorable Finance Minister has proposed three schemes in his Budget; his verbatim statement on these in
the Budget Document is as follows: "India is one of the largest consumers of gold in the world and imports as much as 800-1000 tonnes of gold each year. Though stocks of gold in India are estimated to be over 20,000 tonnes, mostly this gold is neither traded, nor monetized. I propose to:
(i) Introduce a Gold Monetisation Scheme, which will replace both the present Gold Deposit and Gold Metal Loan Schemes. The new scheme will allow the depositors of gold to earn interest in their metal accounts and the jewelers to obtain loans in their metal account. Banks/other dealers would also be able to monetize this gold.
(ii) Also develop an alternate financial asset, a Sovereign Gold Bond, as an alternative to purchasing metal gold. The Bonds will carry a fixed rate of interest, and also be redeemable in cash in terms of the face value of the gold, at the time of redemption by the holder of the Bond.
(iii) Commence work on developing an Indian Gold Coin, which will carry the Ashok Chakra on its face. Such an Indian Gold Coin would help reduce the demand for coins minted outside India and also help to recycle the gold available in the country."
The proposed Sovereign Gold Bond Scheme is, in its design and logic, exactly like a Gold ETF (Exchange Traded Fund), with the difference that the latter doesn't pay any interest but delivers gold returns to investors by investing the entire proceeds of subscription in metal gold, which is held in demat form! Significantly, all 14 Gold ETFs in India, between them, hold no more than 40 to 50 tonnes of gold, a mere 5% of the annual gold imports of 800 to 1000 tonnes mentioned by Hon'ble Finance Minister! So a Gold ETF does not at all reduce metal gold demand; all it does is substitute gold demand from individual metal gold investors with like demand from a professionally and expertly managed Gold ETF instead!
This is exactly what will happen if the Government does the same to deliver gold returns to Sovereign Gold Bond holders on redemption! In the highly unlikely event of the Government even remotely contemplating using the subscription proceeds for a purpose other than investing only in metal gold, it will be exposed to extreme price risk, as it will be committed to delivering gold returns to bond investors on redemption at the ruling gold price. This is important because such an action will amount to the Government incurring a huge short position in gold, potentially exposing the exchequer to any gold price increase! A sense of the enormity of this proposition can be had by considering the fact that the gold price moved from a then all-time high of $850 per oz in the late seventies to a lifetime low of $270 in the late nineties, and then back up to an all-time high of $1920 in September 2011! Assuming the worst case, and hence the most prudent and conservative scenario, of the gold price rising from its recent low of $1180 per oz to its all-time high of $1920 (an increase of about 63%), possibly because of potential monetary stimulus in Japan and the Eurozone, the resulting shock could be fiscally overwhelming and way too disruptive. If we again assume, conservatively, that Gold Bonds would attract the equivalent of 1,000 tonnes of annual gold import demand, this will translate into the rupee equivalent of INR 1.5 lakh crore ($25 billion) of net loss on redemption, not counting the interest paid, and will, all else being equal, increase the fiscal deficit by
a like amount, because the subscription proceeds would not be invested fully in metal gold! Any multiple of 1,000 tonnes will have a multiplier effect: in the extreme scenario of the theoretically maximum subscription, equivalent to the entire metal gold stock in India of 20,000 tonnes, this will translate into a whopping net loss of around INR 30 lakh crore ($485 billion) to the exchequer on redemption at the assumed ruling gold price of $1920 per oz and, as stated before, will, all else being equal, increase the fiscal deficit by a like amount, representing 20% of current GDP! Not only this: large investors in Sovereign Gold Bonds could also engineer the so-called short squeeze close to the due date of redemption, to profit at the expense of the Government by speculatively pushing the gold price higher, fully aware that the Government is hugely short in gold, making the above-mentioned whopping net loss to the exchequer even worse! Contextually, in February 2013, someone from the commodity exchange space, who should have known better, had proposed a variant on the theme
of the above Gold Bond Scheme. He suggested that the subscription proceeds of the sovereign gold bonds be invested in infrastructure projects and that, to deliver gold returns to bond holders without price risk and loss, the Government hedge its price risk by buying gold call options! But this hedging proposition, involving buying gold call options covering, as stated in the preceding paragraphs, 1,000 tonnes worth of annual gold import demand, while tenable in the theory of options, will be untenable in practice, as it will inundate call option writers/sellers and result in the call options getting deep-in-the-money because of the resultant price increase way above the starting strike price at which the Government buys these call options. This will push the delta hedge ratio to almost 1, which means the call option writers will have to physically buy almost 1,000 tonnes of the metal, leaving the physical metal gold demand in the domestic market pretty much the same, if not more, and which, as now, will only be met through imports! This applies just as much to the Government seeking to hedge against a gold price increase by buying gold futures: given the sheer size of the hedging demand, call options and futures will both have a hedge ratio of 1, meaning that the demand for metal gold will be about 1,000 tonnes; only that it will shift from the Government to the metal gold market and, if not supplied in the domestic market, will, as now, have to be met through imports! This is the very basic theory and practice of futures and options hedging and pricing, which is governed by the so-called no-arbitrage argument or, what is the same thing, the law of one price. In other words, there can be no free lunch, no case of eating one's cake (investing cash proceeds of gold bond sales in projects and not in gold) and having it too (replicating and delivering gold returns without investing in physical gold)!
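A rough numerical sketch of the two points above — the uncovered short-position arithmetic and the hedge (delta) ratio tending to 1 — is given below. The loss figures follow directly from the article's prices; the Black-Scholes inputs (interest rate, volatility, tenor) are illustrative assumptions of mine, not figures from the article:

```python
from math import log, sqrt, erf

TROY_OZ_PER_TONNE = 1_000_000 / 31.1035   # ~32,150.7 troy oz per tonne

def uncovered_loss_usd(tonnes: float, p_issue: float, p_redeem: float) -> float:
    """Mark-to-market loss (USD) on an uncovered gold liability of `tonnes`
    if the price moves from p_issue to p_redeem per troy ounce."""
    return tonnes * TROY_OZ_PER_TONNE * (p_redeem - p_issue)

def call_delta(spot: float, strike: float, r: float, sigma: float, t: float) -> float:
    """Black-Scholes delta N(d1) of a European call."""
    d1 = (log(spot / strike) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    return 0.5 * (1.0 + erf(d1 / sqrt(2.0)))

# Short-position arithmetic: $1180 -> $1920 per oz, as in the article.
# 1,000 t works out to ~$24bn (the article rounds to $25bn / INR 1.5 lakh crore);
# 20,000 t to ~$476bn (the article rounds to $485bn).
for tonnes in (1_000, 20_000):
    print(f"{tonnes:>6} t: ${uncovered_loss_usd(tonnes, 1180, 1920) / 1e9:.0f} bn")

# Hedge ratio: a call struck at-the-money at $1180 starts near delta 0.5;
# as spot surges toward $1920 the delta approaches 1, so the option writer
# must hold nearly 1 oz of physical metal per oz of calls written.
for spot in (1180, 1400, 1700, 1920):
    print(f"spot {spot}: delta {call_delta(spot, 1180, 0.02, 0.20, 0.5):.3f}")
```

The climb of the delta toward 1 is exactly the mechanism the text describes: hedging the Government's exposure does not eliminate the physical metal demand, it merely relocates it to the option writers.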
In a lighter vein, if only to dramatise to best effect, the analogy of futures and options hedging with a dialogue of the character Gabbar Singh in the film Sholay is most apt, and I quote: “Oh village folks, if there is anyone who can save you from Gabbar, he is none other than Gabbar himself!” So also: “If there is any asset which can deliver gold returns with gold price risk, it is none other than gold itself!” Of course, the same holds for any other asset like equity stocks, bonds, foreign currency, real estate etc.! So to conclude, the proposed Sovereign Gold Bond Scheme will not deliver in practice! Turning to the other Budget proposal on the so-called monetisation of the 20000-tonne domestic stock of gold which, as the Hon’ble Finance Minister very rightly observed in his speech, is mostly
neither traded nor monetised, I will begin with his own statement that the domestic metal gold stock is mostly not traded. Of course, globally, deep and liquid metal gold deposit and lending markets exist, but only for two reasons; the first being active trading and the second gold miners borrowing metal gold to raise money at ultra-low interest rates to fund their capital investment in either new mines or in expansion of their existing mines. And there is no third reason for this! The dynamics of the global gold deposit and lending markets in New York, London, Singapore and Hong Kong involve gold borrowing demand coming from short sellers and large gold miners. In particular, short selling arises from speculators, hedge funds and other participants betting on declines in the gold price so that they can profit from their bet coming right by buying back gold at a price lower than the price at which they originally short sold and repaying the metal gold loan! But since there is the market discipline of delivery into a short sale, short sellers necessarily have to borrow metal gold to deliver into the short sale, which gets adjusted after some time when short sellers buy metal gold back, either to cut their losses due to stop-loss limits or to book their profits, and use the gold so bought back to repay the borrowed gold! Another reason for short selling metal gold is engagement by market participants in cash-futures arbitrage when gold futures are cheaper relative to the spot market, i.e. when the no-arbitrage argument/law of one price of derivatives is violated. What they then actually do is buy the cheaper gold futures and short sell the expensive gold in the spot market, carrying the arbitrage trades into maturity and earning totally risk-free profits due to such mis-pricing. Such arbitrage trades continue until, as a result of these trades, the price discrepancy eventually disappears; the faster this happens, the more efficient the markets!
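The cash-futures arbitrage described above rests on the standard cost-of-carry relation, F = S·e^((r+u)T). A minimal sketch, with illustrative financing and storage rates (assumptions, not market data):

```python
# Cost-of-carry fair value of a gold future: F = S * exp((r + u) * T),
# where r is the financing rate and u the storage/insurance cost rate.
# If the traded future sits below this fair value, the arbitrage described
# above applies: buy the cheap future, short sell gold in the spot market,
# invest the sale proceeds at r, and converge the trades at maturity.
from math import exp

def fair_future(spot, r, storage, t_years):
    return spot * exp((r + storage) * t_years)

spot, r, storage, t = 1920.0, 0.02, 0.005, 1.0   # illustrative inputs, $/oz
fair = fair_future(spot, r, storage, t)           # ~1968.6
traded = 1950.0                                   # hypothetical market quote
if traded < fair:
    # risk-free profit per ounce, locked in today and realised at maturity
    print(f"buy future / short spot: lock in ~${fair - traded:.2f}/oz")
```

Trades like this are what drive the futures and spot prices back into line – the “law of one price” the article invokes.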
Since globally there is large-scale demand for such borrowed metal gold, supply comes from gold deposits and, as in any bank deposit and loan market, there are deposit and lending/borrowing rates, known as gold lease rates, and these are way too low compared to currency rates because they are, in theory and practice, nothing but the difference between uncollateralised currency interest rates and those collateralised by metal gold. Specifically, when gold is short sold, the proceeds of the short sale are used by the short sellers to collateralise their borrowing of metal gold, which amounts to lending by short sellers of the cash from the short sale to metal gold lenders at an
interest rate lower than the rate at which they would otherwise lend; the more desperate the metal gold borrowers are to borrow, the lower the interest rate they will charge the lender of the metal gold on that cash, and so much higher will be the metal gold lease rates! On the other hand, in an aggressive bull market in gold, when prices shoot up, the demand for borrowed gold will be relatively far lower and, therefore, gold lease rates will be lower and can be, and have indeed been, negative! These gold lease rates are, in both theory and practice, a fraction of a currency’s interest rates! Illustratively, currently these are 0.09%, 0.11%, 0.15%, 0.25% and 0.40% for 1 month, 2 months, 3 months, 6 months and 1 year, respectively!

[Sidebar: Proposed Ashoka Chakra Gold Coin]

Incidentally, in the Indian context, there has been frequent demand from some quarters that the interest rates paid on gold deposits be higher than currently paid, without appreciating the fact that each asset has its own yield curve; for example, just as you cannot pay
Indian Rupee rates on Yen and Dollar deposits, so also you cannot pay Indian Rupee interest rates on gold deposits! Coming back to gold lease rates, they have also been negative during episodes of an aggressive bull market in gold, when lenders of metal gold far exceed the borrowers of gold to the point that lease rates become zero, but there are still storage and insurance costs, which make lease rates negative! As mentioned above, other than short sellers and arbitrageurs there is only one more kind of metal gold loan borrower, namely, large gold miners, who borrow metal gold for longer periods to sell the metal in the spot market and use the sale proceeds to fund capital investment in new mines or in capacity expansion of their existing gold mines. They have a natural hedge against a gold price rise, as they use the metal gold mined to repay their metal gold loans without any price risk, and thus have the natural advantage of raising capital at a fraction of the cost of debt and equity capital, unlike other, non-gold businesses! As regards lending gold to jewellers, to call it a metal gold loan is a misnomer because, effectively and substantively, it is nothing but a sale and purchase transaction as, in both finance theory and practice, a deposit/lending transaction is one where whatever is borrowed has to be repaid, and to that extent it is a no-brainer to see that the proposed monetisation scheme will not deliver, because any depositor of any asset would want the deposit back on the redemption and maturity date and not have it sold in the first place! To conclude, whether the proposed gold deposit and lending scheme will deliver in practice will depend critically on whether we have in India both active gold trading, involving, as I said before, short selling and arbitraging, and gold mines of global scale! And, as we already know, these necessary and sufficient conditions are not satisfied in the Indian context.
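The lease-rate mechanics described above – the spread between uncollateralised cash rates and cash rates collateralised by metal gold – can be sketched with hypothetical numbers:

```python
# Gold lease rate modelled, as described above, as the spread between the
# uncollateralised cash rate and the cash rate collateralised by metal gold,
# net of storage and insurance costs. All rates are hypothetical, in % p.a.
def gold_lease_rate(unsecured_rate, gold_collateralised_rate, carry_cost=0.0):
    return unsecured_rate - gold_collateralised_rate - carry_cost

# Active borrowing demand: short sellers lend their cash collateral cheaply,
# so the spread -- the lease rate -- is positive.
print(round(gold_lease_rate(0.50, 0.10), 2))        # 0.4
# Aggressive bull market: borrowing demand dries up, the spread collapses,
# and storage/insurance costs push the lease rate negative.
print(round(gold_lease_rate(0.50, 0.50, 0.15), 2))  # -0.15
```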
In view of the foregoing reasons, both the Sovereign Gold Bond Scheme and the Gold Monetisation Scheme will not deliver! What, however, will deliver is what actually and effectively did deliver only as recently as 2013-14, that is, RBI restoring the status quo ante by re-creating a level playing field between gold/non-gold imports by stopping gold imports on 1) Consignment basis, 2) Unfixed price basis and 3) Metal loan basis! The author, V K Sharma, is ex-ED, RBI; the opinions shared are his own. Transact360 is in no way responsible for his comments. You may reach him via e-mail: vksvs2009@gmail.com or call +91 9820 267 474
Ravish Mehta
A bank’s leadership should take a hands-on, proactive approach to risk management, and make sure its goals and objectives are communicated from the top down. In good times and bad, an effective risk management program can be critical in determining a company’s success. While regional and community banks may have avoided the worst of the credit debacle as compared with their commercial banking counterparts, the recessionary environment poses plenty of pitfalls in the months to come. Default rates likely will escalate as the economy continues to weaken and unemployment increases. Regulators, too,
have their sights set on all financial institutions. As they attempt to weather the current storm, all financial institutions should be taking a critical look at their operations and risk management processes. As challenges in the marketplace continue to multiply, a focus on strong risk management will serve banks well. Although the causes of the credit crisis are complex, a recent survey on risk management in banking found risk management breakdowns in many areas: weaknesses in risk culture and
governance; gaps in risk expertise, both at the board and operating level; a lack of influence of the risk function; a lack of responsibility and accountability by those on the front lines; a focus on short-term gains at the expense of long-term foresight; and an overreliance on models that could not foresee extreme events. In particular, the survey findings revealed several areas of risk management that banks should examine in an effort to emerge from the current environment in a position of strength. Despite improving its profile, the risk management function still struggles to gain influence. Although 81% of survey respondents report that risk management is viewed by their bank as a competitive advantage, a vast majority (76%) still find that risk is stigmatized as a support function. It is encouraging to note that chief risk officers, or their equivalent, are starting to have greater authority in strategic development and capital allocation, but it is not clear that this trend has filtered into all areas of the business, such as mergers and acquisitions. An active risk committee supporting the board can be an effective mechanism for raising the profile of the risk function, yet 25% of survey respondents see no need for this body. While the size and complexity of an organization may dictate committee form, it seems that without such a body, banks could be lacking a rigorous, independent challenge to the judgments and decisions being made at the business level. Until the risk management process becomes embedded into the strategic framework of the company, banks will continue to run the risk of ignoring vital information that can help guide critical business decisions. To develop a more robust risk culture, banks must lead from the top. Financial models don’t prevent poor risk decisions; people do. The risk culture is where risk management problems often start. In fact, almost half (48%) of banking executives cited risk culture as one of the leading causes of the credit crisis. Senior-level “tone” can have a significant impact on risk management, affecting an organization’s governance, risk appetite, limits, compliance, and control. Leadership must establish an appropriate, enterprise-wide framework within which risk can be measured, reported, and managed. Companies must also understand how much risk they are willing to accept in pursuit of strategic objectives. The goal is to create a risk management framework where the appetite for risk is clearly defined within the parameters of the organization, and where everyone in the organization understands the risk appetite and has a role in managing risk. Only when senior leadership drives the risk culture will the motivation exist for such changes at the operational level. Fortunately, banks seem to be ready to make changes. Most (77%) survey respondents say they are dedicated to instilling a more robust risk culture in their organizations and feel that greater “tone at the top” and a more authoritative risk function are key to such a transformation. Banks want better information for decision making. Almost eight out of 10 respondents are seeking to improve the way that risk is measured and reported, a clear acknowledgement that previous models did not sufficiently measure potential risk exposures. Organizations need to make sure they are collecting data of sufficient quantity and quality to provide a true picture of changing levels of risk. Strong performers will use adaptive, rather than static, processes to measure risk, allowing them to factor in even dramatic changes in trends as they occur. While historical data have regularly been used to predict the future, we now know that contemplation of unforeseen shifts in conditions is just as important. According to survey results, future emphasis will be on stress testing and scenario analysis, but only time will tell if such measures will be flexible or sophisticated enough to fully capture the range of possible outcomes. Regardless, a shift toward contemplating both deterministic and stochastic modeling results, and recognizing that qualitative information is at least as important as quantitative information, should help develop a more robust framework within which to make strategic decisions. Every bank faces its own set of diverse challenges, with its own unique risk appetite and philosophies. The structures will vary, but the current crisis highlights an urgent need to focus on improved enterprise-wide risk management procedures. An understanding of the risk appetite by everyone in the organization, coupled with strong risk governance and robust decision-making tools, will help companies persevere as they face the challenges ahead.
The author is a thought leadership evangelist. Views and opinions are those of the author and do not necessarily represent the views and opinions of IndusInd Bank. All information provided is of a general nature and is not intended to address the circumstances of any particular individual or entity. Ravish may be contacted at ravish.mehta@fspro.in
Leadership
The recent past has witnessed the public downfall of leaders from almost every area of endeavor: business, politics, religion, and sports. One day they’re on top of the heap; the next, the heap’s on top of them. Of course, most think that such catastrophic failure could never happen to them. They have worked hard to achieve their well-deserved positions of leadership, and they won’t give them up for anything! The bad news is: the distance between beloved leader and despised failure is shorter than most people think. Irrespective of what people think, leaders, like all of us, are human. It is just that they have polished certain skillsets that make them better at implementing, or getting work implemented, than most of us. But they too can slip up. Slip-ups that can make them lose their leadership position and status. A case in point is the recent fall of Tiger Woods, the global golf legend. It makes good sense to heed the warning signals that occur in all cases of a person losing their leadership qualities. ABOUT THE AUTHOR
Ravish Mehta works for IndusInd Bank and is a staunch believer in the need for sharing knowledge to evangelize and deliver best practices for professionals. He can be reached at ravish.mehta@indusind.com
Fortunes of global companies have suffered in the recent past. Poor economies and bear markets have contributed to the fall in profits, but is that all? Leaders and their leadership impact have to assume some responsibility for the downslide. There are many reasons why a leader fails; some of them might be out of the leader’s control, others due to the leaders themselves. The reasons differ at each stage of the career. When starting out, the principal reason for failure is that of not creating followers. There is a difference between managing and leading. It is not necessarily expertise that the team is looking for – the team members have expertise of their own. What the team really wants from their leader is ‘attention’. They want their work to accomplish something and they want to be recognised for accomplishing it. They will stay, grow and contribute to an organisation when they are valued, and the leader is the prime giver of that value. The next level of failure is that of a ‘charismatic’ leader: the one who makes everyone feel like they matter BUT only if it matters to self. This is a leader with a vision who is out to accomplish something big – but only for self. A leader’s vision has to come from his/her values. The leader fails when he/she mistakes ego for values. The leader should share the values with everyone and anyone to make it happen, and not worry about taking all the credit or getting rich. Finally, a few leaders fail because they cannot see their own value. They prefer attributing success to plain good luck or some other external factor. They are comfortable in their zone. They tend to be weighed down by expectations. They do not believe they are qualified to take up more significant issues. They do not want to give more of themselves to larger challenges. The competencies which help these leaders succeed or fail differ from those required at other levels.
A lot of leaders fail because they lack a clear vision, fail to build teams or do not execute well. Failure to communicate with stakeholders and poor interpersonal skills also contribute to failure. Just for the sake of discussion, let us group professionals in an organisation into 4 major categories: “C” level (senior management), middle management, front-line professionals and human resource professionals. Every strategy needs carefully drafted plans to be carried out at all levels. Execution of strategies only happens with clear communication and team co-operation across the organisation. Leaders need to create a shared mindset in their organizations. They can only do this by having a clear vision and strategies and communicating these in a way that gains stakeholder confidence and support. An important thing for leaders (“C” level) to know is: nothing happens unless middle management and the front line have been communicated the strategy, the plan and the final outcome required. These are the people who translate strategies into action. They are key to successful execution. Leaders on the wane usually display one or more of: Lowered energy and enthusiasm: They see new initiatives as a burden, rarely volunteer, and fear being overwhelmed. Acceptance of their own mediocre performance: They overstate the difficulties of reaching targets, so that they look good when they achieve them. They live by the mantra “Under Promise and Over Deliver.” Sticking to familiar trails: They believe their only job is to execute. Like a hiker who sticks close to the trail, they’re fine until they come to a fork. Individual decisions: They make decisions that colleagues and subordinates mostly consider not to be in the organization’s best interests – and the colleagues are mostly proved right. Failure to collaborate: They avoid peers, act independently, and view other leaders as competitors. As a result, they are set
adrift by the very people whose insights and support they need. Failure to walk the talk: They preach and set standards of behavior or expectations of performance and then violate them, which is then perceived as lacking integrity. Resisting creativity: They reject suggestions from subordinates and peers. Good ideas aren’t implemented, creativity gets snubbed, and the organization gets stuck. Failure to learn from mistakes: They may make no more mistakes than their peers, but they fail to use setbacks as opportunities for improvement, hiding their errors and brooding about them instead. Poor interpersonal skills: They commit sins of both commission (they’re abrasive and bullying) and omission (they’re aloof, unavailable, and reluctant to praise). Failure to develop the second line: They focus on themselves to the exclusion of developing subordinates, causing individuals and teams to disengage. Leaders in business need to hire, fire, motivate, inspire, coach and drive for results. Every time I think of clear and effective leadership, I am drawn to the particular sequence in the movie ‘Gladiator’ (the Russell Crowe starrer), where the ‘Spaniard’ gears up all his partners to fight the Barbarians for the first time in the Colosseum. He asks who has served in the army and chooses his second string of leaders for quick co-ordination. He then follows the natural steps of team building: forming, storming, norming and performing. At the end, the team performs without the leader around to tell them what needs to be done at every step. Leading is a special privilege and responsibility offered to just a few. Those who step into this role also need to step out of their small lives into larger worlds of values and vision. They need to give to others from the vast well of their lives. They need to dare to lead. Leaders are usually distinguished by their ability to “think big.” But when their focus shifts, they suddenly start thinking small.
They micro-manage, they get caught up in details better left to others, they become consumed with the trivial and unimportant. And to make matters worse, this tendency can be exacerbated by an inclination toward perfectionism. A more subtle leadership derailer is an obsession with “doing” rather than “becoming.” The good work of leadership is usually a result of who the leader is. What the leader does then flows naturally from inner vision and character. It is possible for a leader to become too action-oriented and, in the process, lose touch with the more important development of self. What is your primary focus right now? If you can’t write it on the back of your business card, then it’s a sure bet that your leadership is suffering from a lack of clarity. Take the time necessary to get your focus back on what’s important. Further, would you describe your thinking as expansive or contractive? Of course, you should always be willing to do whatever it takes to get the job done, but try never to take on what others can do as well as you. In short, make sure that your focus is on leading rather than doing. To make sure that you stay on the track of following your first love, frequently ask yourself these three questions: Why did I initially assume leadership? Have those reasons changed? Do I still want to lead? Heed the signs: The warning signs in life – from stop lights to prescription labels – are there for our good. They protect us from disaster, and we would be foolish to ignore them. As you consider the warning signs of leadership failure, don’t be afraid to take an honest look at yourself. If any of the warnings ring true, take action today! The good news is: by paying attention to these signs and heeding their warnings, you can avoid disaster and sustain the kind of leadership that is healthy and fulfilling for all.
Management
THE D’FACTOR
Organizations are run by people, who make processes that help make progress and work more consistent. These processes are built from the learnings of the people who operate in that environment. One key parameter that drives an organization is accountability and governance – how rapidly and effectively decision making and responsibility allocation work. People play a critical role: every success, every mishap, every opportunity seized or missed is the result of a decision that someone made or failed to make. At many companies, decisions routinely get stuck inside the cogs of an organization. These cogs are the gears that deliver the performance of the entire organization and run it smoothly – and that is what is at stake. It does not matter what industry, how big and well known the company may be, or what clever strategy is thought to be implemented: if the organization cannot make quick, right, effective decisions, executed to the T, the business will lose critical ground. Making and implementing good decisions
are the hallmarks of high-performing organizations. In an online organizational effectiveness survey across 350 companies, only 12.3% of respondents said that they have an organization that helps the business outperform competitors. What sets those top performers apart is the quality, speed, and execution of their decision making. The most effective organizations score well on the major strategic decisions – which markets to enter or exit, which businesses to buy or sell, where to allocate capital and talent. But they truly shine when it comes to the critical operating decisions requiring consistency and speed – how to drive product innovation, the best way to position brands, how to manage channel partners. Even in companies respected for their decisiveness, however, there can be ambiguity over who is accountable for which decisions. As a result, the entire decision-making process can stall, usually at one of four bottlenecks: global versus local, center versus business unit, function versus function, and inside versus outside partners. The first of these bottlenecks, global versus local decision making, can occur in nearly every major business process and function. Decisions about brand building and product development frequently get snared here, when companies wrestle over how much authority local businesses should have to tailor products for their markets. Marketing is another classic global versus local issue – should local markets have the power to determine pricing and advertising?
The mantra for smart organizations is not always to be lean in size, but to be lean and more decisive – to implement plans and strategy more quickly. It helps to know where the power bottlenecks are... and also who is empowered to break through them. The second bottleneck, center versus business unit decision making, tends to afflict parent companies and their subsidiaries. Business units are on the front line, close to the customer; the center sees the big picture, sets broad goals, and keeps the organization focused on winning. Where should the decision-making power lie? Should a major capital investment, for example, depend on the approval of the business unit that will own it, or should headquarters make the final call? Function versus function decision making is perhaps the most common bottleneck. Every manufacturer, for instance, faces a balancing act between product development and marketing during the design of a new product. Who should decide what? Cross-functional decisions too often result in ineffective compromise solutions, which frequently need to be revisited because the right people were not involved at the outset. The fourth decision-making bottleneck, inside versus outside partners, has become familiar with the rise of outsourcing, joint ventures, strategic alliances, and franchising. In such arrangements, companies need to be absolutely clear about which decisions can be owned by the external partner (usually those about the execution of strategy) and which must continue to be made internally (decisions about the strategy itself). In the case of outsourcing, for instance, brand-name apparel and footwear marketers once assumed that overseas suppliers could be responsible for decisions about plant employees’ wages and working conditions. Big mistake!

Clearing bottlenecks
The most important step in unclogging decision-making bottlenecks is assigning clear roles and responsibilities. Good decision makers recognize which decisions really matter to performance. They think through who should recommend a particular path, who needs to agree, who should have input, who has ultimate responsibility for making the decision, and who is accountable for follow-through. They make the process routine. The result: better coordination and quicker response times. Companies have devised a number of methods to clarify decision roles and assign responsibilities. One approach, called RAPID, which has evolved over the years, has been used to help hundreds of companies develop clear decision-making guidelines. It is, for sure, not a panacea (an indecisive decision maker, for example, can ruin any good system), but it’s an important start. The letters in RAPID stand for the primary roles in any decision-making process, although these roles are not performed exactly in this order: recommend, agree, perform, input, and decide – the “D.” (See “A Decision-Making Primer.”) The people who recommend a course of action are responsible for making a proposal or offering alternatives. They need data and analysis to support their recommendations, as well as common sense about what’s reasonable, practical, and effective. The people who agree to a recommendation are those who need to sign off on it before it can move forward. If they veto a proposal, they must either work with the recommender to come up with an alternative or elevate the issue to the person with the D. For
decision making to function smoothly, only a few people should have such veto power. They may be executives responsible for legal or regulatory compliance or the heads of units whose operations will be significantly affected by the decision. People with input responsibilities are consulted about the recommendation. Their role is to provide the relevant facts that are the basis of any good decision: How practical is the proposal? Can manufacturing accommodate the design change? Where there’s dissent or contrasting views, it’s important to get these people to the table at the right time. The recommender has no obligation to act on the input he or she receives but is expected to take it into account – particularly since the people who provide input are generally among those who must implement a decision. Consensus is a worthy goal, but as a decision-making standard it can be an obstacle to action, or a recipe for lowest-common-denominator compromise. A more practical objective is to get everyone involved to buy in to the decision. Eventually, one person will decide. The decision maker is the single point of accountability who must bring the decision to closure and commit the organization to act on it. To be strong and effective, the person with the D needs good business judgment, a grasp of the relevant trade-offs, a bias for action, and a keen awareness of the organization that will execute the decision. The final role in the process involves the people who will perform the decision. They see to it that the decision is implemented promptly and effectively. It’s a crucial role. Very often, a good decision executed quickly beats a brilliant decision implemented slowly or poorly. RAPID can be used to help redesign the way an organization works or to target a single bottleneck.

A Decision-Making Primer
Good decision making depends on assigning clear and specific roles. This sounds simple enough, but many companies struggle to make decisions because lots of people feel accountable – or no one does. RAPID and other tools used to analyze decision making give senior management teams a method for assigning roles and involving the relevant people. The key is to be clear about who has input, who gets to decide, and who gets it done. The five letters in RAPID correspond to the five critical decision-making roles: recommend, agree, perform, input, and decide. As you’ll see, the roles are not carried out lockstep in this order – we took some liberties for the sake of creating a useful acronym.
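As a minimal sketch, a RAPID role assignment can be written down explicitly; the decision, names and role holders below are hypothetical, not taken from any company in the article:

```python
# Sketch of recording RAPID roles for one decision; all names are hypothetical.
from dataclasses import dataclass

@dataclass
class RapidDecision:
    name: str
    recommend: list[str]  # propose a course of action with supporting analysis
    agree: list[str]      # must sign off (veto power) before it moves forward
    perform: list[str]    # implement the decision promptly and effectively
    input: list[str]      # consulted for facts; the recommender weighs the input
    decide: str           # the "D": the single point of accountability

local_pricing = RapidDecision(
    name="Local pricing and advertising",
    recommend=["local marketing head"],
    agree=["legal counsel"],
    perform=["local sales team"],
    input=["global brand team", "finance"],
    decide="regional general manager",  # exactly one person holds the D
)

print(local_pricing.decide)  # regional general manager
```

Making the D a single string, rather than a list, mirrors the article’s point that there must be exactly one accountable decision maker.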
A DECISION DIAGNOSTIC
Consider the last three meaningful decisions you’ve been involved in and ask yourself the following questions.
1 Were the decisions right?
2 Were they made with appropriate speed?
3 Were they executed well?
4 Were the right people involved, in the right way?
5 Was it clear for each decision
• who would recommend a solution?
• who would provide input?
• who had the final say?
• who would be responsible for following through?
6 Were decision roles, process, and time frame respected?
7 Were decisions based on facts?
8 To the extent that there were divergent facts or opinions, was it clear who had the D?
9 Were decision makers at appropriate levels in the company?
10 Did the organization’s measures and incentives encourage the people involved to make the right decisions?
Some companies use the approach for the top ten to 20 decisions, or just for the CEO and his or her direct reports. Other companies use it throughout the organization – to improve customer service by clarifying decision roles on the front line, for instance. When people see an effective process for making decisions, they spread the word. For example, after senior managers at a major retailer used RAPID to sort out a particularly thorny set of corporate decisions, they promptly built the process into their own functional organizations. To see the process in action, look at the way four companies have worked through their decision-making bottlenecks.

Global Versus Local

Every major company today operates in global markets, buying raw materials in one place, shipping them somewhere else, and selling finished products all over the world. Most are trying simultaneously to build local presence and expertise, and to achieve economies of scale. Decision making in this environment is far from straightforward. Frequently, decisions cut across the boundaries between global and local managers, and sometimes across a regional layer in between: What investments will streamline the supply chain? How far should we go in standardizing products or tailoring them for local markets? The trick in decision making is to avoid becoming either mindlessly global or hopelessly local. If decision-making authority tilts too far toward global executives, local customers’ preferences can easily be overlooked, undermining the efficiency and agility of local operations. But with too much local authority, a company is likely to miss out on crucial economies of scale or opportunities with global clients. To strike the right balance, a company must recognize its most important sources of value and make sure that decision roles line up with them.
Center Versus Business Unit

The first rule for making good decisions is to involve the right people at the right level of the organization. For many companies, a balancing act takes place between executives at the center and managers in the business units. If too many decisions flow to the center, decision making can grind to a halt. The problem is different but no less critical if the decisions that are elevated to senior executives are the wrong ones. Companies often grow into this type of problem. In small and midsize organizations, a single management team – sometimes a single leader
The defining characteristic of high-performing organizations is their ability to make good decisions and to make them happen quickly. The companies that succeed tend to follow a few clear principles.

Some decisions matter more than others. The decisions that are crucial to building value in the business are the ones that matter most. Some of them will be the big strategic decisions, but just as important are the critical operating decisions that drive the business day to day and are vital to effective execution.

Action is the goal. Good decision making does not end with a decision; it ends with implementation. The objective should not be consensus, which often becomes an obstacle to action, but buy-in.
Ambiguity is the enemy. Clear accountability is essential: Who contributes input, who makes the decision, and who carries it out? Without clarity, gridlock and delay are the most likely outcomes. Clarity doesn’t necessarily mean concentrating authority in a few people; it means defining who has responsibility to make decisions, who has input, and who is charged with putting them into action.
Speed and adaptability are crucial. A company that makes good decisions quickly has a higher metabolism, which allows it to act on opportunities and overcome obstacles. The best decision makers create an environment where people can come together quickly and efficiently to make the most important decisions.
Decision roles trump the organizational chart. No decision-making structure will be perfect for every decision. The key is to involve the right people at the right level in the right part of the organization at the right time.
A well-aligned organization reinforces roles. Clear decision roles are critical, but they are not enough. If an organization does not reinforce the right approach to decision making through its measures and incentives, information flows, and culture, the behavior won’t become routine.
Practicing beats preaching. Involve the people who will live with the new decision roles in designing them. The very process of thinking about new decision behaviors motivates people to adopt them.
– effectively handles every major decision. As a company grows and its operations become more complex, however, senior executives can no longer master the details required to make decisions in every business. A change in management style, often triggered by the arrival of a new CEO, can create similar tensions.

Function Versus Function

Decisions that cut across functions are some of the most important a company faces. Indeed, cross-functional collaboration has become an axiom of business, essential for arriving at the best answers for the company and its customers. But fluid decision making across functional teams remains a constant challenge, even for companies known for doing it well. For instance, a team that thinks it’s more efficient to make a decision without consulting
a vital – but perishable – opportunity to establish a leading position with a promising drug. Competitors were working on the same class of drug, so the company needed to move quickly. This meant expanding production capacity by building a new plant. The decision, by any standard, was a complex one. Once approved by regulators, the facility would be the biggest biotech plant in the world – and the largest capital investment the company had ever undertaken. Yet peak demand for the drug was not easy to determine. What’s more, the company planned to market the product in partnership. In its deliberations about the plant, therefore, the company needed to factor in the requirements of building up its technical expertise, technology transfer issues, and an uncertain competitive environment. Input on the decision filtered up
fiefdoms and costly indecision. The theme here is a lack of clarity about who has the D. For example, at a global auto manufacturer that was missing its milestones for rolling out new models – and was paying the price in falling sales – it turned out that marketers and product developers were confused about which function was responsible for making decisions about standard features and color ranges for new models. When we asked the marketing team who had the D about which features should be standard, 83% said the marketers did. When we posed the same question to product developers, 64% said the responsibility rested with them. (See the exhibit “Decision-Making Bottleneck.”) The practical difficulty of connecting functions through smooth decision making crops up frequently
Decision-Making Bottleneck
At one organization studied, marketers and product developers were confused about who was responsible for making decisions.
When asked, “Who has the right to decide which features will be standard?” 64% of product developers said, “We do,” and 83% of marketers said, “We do.” When asked, “Who has the right to decide which colors will be offered?” 77% of developers and 61% of marketers said, “We do.” Not surprisingly, decisions were delayed.
When his successor began seeking consensus on important issues, the team was suddenly unsure of its role, and many decisions stalled. It’s a common scenario, yet most management teams and boards of directors don’t specify how decision-making authority should change as the company does. A growth opportunity highlighted that issue for a pharmaceutical company. Through organic growth, acquisitions, and partnerships, the pharmaceutical division had developed three sizable businesses: biotech, vaccines, and traditional pharmaceutical products. Even though each business had its own market dynamics, operating requirements, and research focus, most important decisions were pushed up to one group of senior executives. The problem crystallized when managers in the biotech business saw

slowly through a gauze of overlapping committees, leaving senior executives hungry for a more detailed grasp of the issues. Given the narrow window of opportunity, the company acted quickly, moving from a first look at the project to implementation in six months. But in the midst of this process, executives saw the larger issue: The company needed a system that would push more decisions down to the business units, where operational knowledge was greatest, and elevate only the most critical decisions. A team that makes a decision without consulting other functions may wind up missing out on relevant input or being overruled by another team that believes – rightly or wrongly – it should have been included in the process. Many of the most important cross-functional decisions are, by their very nature, the most difficult to orchestrate, and that can string out the process and lead to sparring between
at retailers. There are businesses that pioneered employee ownership. The USP here is the strong connection between managers and employees that permeates every aspect of the organization’s operations and has remained vital to the company as it grew into the largest employee-owned business, with lakhs of employee partners and billions in assets. Even here, however, with its heritage of cooperation and teamwork, cross-functional decision making can be hard to sustain. The element of scale is one reason why cross-functional bottlenecks are not easy to unclog. Different functions have different incentives and goals, which are often in conflict. When it comes down to a struggle between two functions, there may be good reasons to locate the D in either place – buying or selling, marketing or product development. Here, as elsewhere, someone needs
to think objectively about where value is created and assign decision roles accordingly. Eliminating cross-functional bottlenecks actually has less to do with shifting decision-making responsibilities between departments and more to do with ensuring that the people with relevant information are allowed to share it. The decision maker is important, of course, but more important is designing a system that aligns decision making and makes it routine.

Inside Versus Outside Partners

Decision making is hard enough. Trying to make decisions between separate organizations on different continents adds layers of complexity that can scuttle the best strategy. Companies that outsource capabilities in pursuit of cost and quality advantages face this very challenge.
Which decisions should be made internally? Which can be delegated to outsourcing partners? These questions are also relevant for strategic partners – a global bank working with an IT contractor on a systems development project, for example, or a media company that acquires content from a studio – and for companies conducting part of their business through franchisees. There is no right answer to who should have the power to decide what. But the wrong approach is to assume that contractual arrangements can provide the answer. If managers suddenly realize that they’re spending less time sitting through meetings wondering why they are there, that’s a signal that their companies have become better at making decisions. When meetings start with a common understanding about who is responsible for providing valuable input and who has the D, an organization’s decision-making metabolism will get a boost. No single lever turns a decision-challenged organization into a decision-driven one, of course, and no blueprint can provide for all the
KEY DATA POINTS

Recommend>> People in this role are responsible for making a proposal, gathering input, and providing the right data and analysis to make a sensible decision in a timely fashion. While developing a proposal, recommenders consult with the people who provide input, not just hearing and incorporating their views but also building buy-in along the way. Recommenders must have analytical skills, common sense, and organizational smarts.
Agree>> Individuals in this role have veto power – yes or no – over the recommendation. Exercising the veto triggers a debate between them and the recommenders, which should lead to a modified proposal. If that takes too long, or if the two parties simply can’t agree, they can and must escalate.
Input>> These people are consulted on the decision. Because people who provide input are typically involved in implementation, recommenders have a strong interest in taking their advice seriously. No input is binding, but this shouldn’t undermine its importance. If the right people are not involved and motivated, the decision is far more likely to falter during execution.
Decide>> The person with the D is the formal decision maker. He or she is ultimately accountable for the decision, for better or worse, and has the authority to resolve any impasse in the decision-making process and commit the organization to action.
Perform>> Once a decision is made, a person or group of people will be responsible for executing it. In some instances, the people responsible for implementing a decision are the same people who recommended it.

Writing down roles and assigning accountability are essential steps, but good decision making also requires the right process. Too many rules can cause the process to collapse under its own weight. The most effective process is grounded in specifics but simple enough to adapt, if necessary. When the process gets slowed down, the problem can often be traced back to one of three trouble spots. First, there is a lack of clarity about who has the D. If more than one person thinks they have it for a particular decision, that decision will get caught up in a tug-of-war. The flip side can be equally damaging: No one is accountable for crucial decisions, and business suffers.
Second, a proliferation of people who have veto power can make life tough for recommenders. If a company has too many people in the “agree” role, it usually means that decisions are not pushed down far enough in the organization. Third, if there are a lot of people giving input, it’s a signal that at least some of them aren’t making a meaningful contribution.

contingencies and business shifts a company is bound to encounter. The most successful companies use simple tools that help them recognize potential bottlenecks and think through decision roles and responsibilities with each change in the business environment. That’s difficult to do – and even more difficult for competitors to copy. But by taking some very practical steps, any company can become more effective, beginning with its next decision. This decision-making authority needs to be defined very clearly by organizations
wanting to improve performance. It is clearly seen that as organizations grow, the processes set in place are rarely questioned, for fear of upsetting the applecart, unless a visionary leader willing to risk their neck and career steps in and makes those D’s. And unfortunately, the D’s made will qualify themselves based not only on internal circumstances but also on the way the economy and markets move. The only way this risk can be minimized is by studying historical business cycles, which can give quite an
The branch of the financial services organization has time and again proved to be a show of strength for the industry, having built relationships that are almost habit-forming. The personal touch of the branch offers solace to the investor: the bank branch they have been going to for years is the place they will go to for trustworthy advice. A point that banks need to leverage is the branch itself, as the modes of transaction rapidly change from physical banking to ATMs to online and now mobile, reducing the human touch in transactions. Indeed, in spite of all these developments, the bank branch has survived – banks are now looking at getting customers back to the branch.

Facing challenges

There is no question that our organizations operate in a tough economic environment. Although the
global economy is technically no longer in a recession, tepid loan demand and continued asset quality problems have placed extraordinary pressure on capital and earnings. Higher interest charges and other compliance restrictions will significantly reduce one of the industry’s most reliable revenue sources, straining profitability and forcing many financial service providers to take a closer, harder look at underperforming branches. Added to this is the change in how many consumers access their banks. After nearly 30 years, remote access services are rapidly spreading and competing with traditional branches for transactions and customer loyalty. Whether it’s paying bills online from a laptop computer in a local coffee shop with internet access or transferring funds from one account to another via a mobile phone while waiting in line at the supermarket, alternative distribution channels are gaining critical mass. And that in turn is forcing the branch to
evolve from a quasi-public utility (think “post office”) to the central platform in a highly focused relationship management strategy – which is a game most banks still haven’t learned how to play. But learn they must, because brick and mortar remains by far the most expensive retail distribution channel, and in a challenging economic environment, any branch that is not solidly profitable is a likely candidate for closure. Branches are increasingly treated as profit centres, and banks are asking themselves whether they are getting enough out of each branch to justify its costs. The branch retains this foundational role because most consumers who open a new checking or savings account strongly prefer to do that transaction in person. The branch is also important to many local businesses that make daily cash deposits. And of course, banking is still a people business, and the nearby branch remains the one place where customers
consult on financial issues, continuing to be the anchor for most consumer and small-business financial relationships. In all places, it is the personified image of the entity that people trust with their money. In the consumer world, the branch is a tangible entity that projects a very intangible concept – the institution’s brand identity – into the community, where it can be experienced in a very visceral way. Some banks use their branches as a billboard for differentiation in the marketplace by building something that goes beyond banking and becoming part of the community.

Running a profitable branch system in today’s challenging economic environment requires a clear strategy and strong execution. The bank keeps an active list of underperforming branches and is not reluctant to close those where the problem stems from a poor location; what needs to be done is to prune the branch and set up one in a better location. Some banks give each branch its own balance sheet and profit-and-loss statement, and run it like a separate business. The bank has a demanding sales culture in which each branch has sales goals that are tracked obsessively: the weekly sales cycle kicks off Monday morning with a bank-wide sales meeting, continues through the week with various other meetings as the progress of each branch and each sales associate is tracked against goals, and concludes with a bank-wide debriefing on the weekend – an intense weekly management system. Every branch has certain fixed costs, which generally include the cost of the building, land, and equipment, along with personnel costs and shared overhead for such corporate activities as compliance and auditing. Banks have tried to lower their branch overhead in recent years by building smaller units and using inexpensive furnishings.
Some banks follow a relatively simple franchise model of consumer and small-business checking accounts, and use those accounts to develop a basic blocking-and-tackling methodology that leads to good execution, good sales, and good service – and, in turn, to deeper relationships with customers.

Integrating strategy

In addition to the branch, there are three other primary distribution channels in retail banking: ATMs, call centers, and the internet. ATMs and call centers have been around for a long time and have contributed to a steady decline in branch traffic. The rapid expansion of the online channel in recent years has also shifted a significant volume of transactions away from the branch. And the projected strong growth rate for mobile banking will reduce branch transaction volume even further while creating new product
opportunities for banks. Some banks have a person-to-person payment service in pilot now that customers can access through their mobile phones. The pace of technology spending to support various alternative distribution channels has slowed in recent years as the industry’s profitability has declined. And yet spending on these channels will most likely increase as the economy recovers. But the proliferation of distribution channels has a downside as well, since it becomes expensive for banks to offer all of these options unless they are able to recapture some of the associated capital expenditures and operating expenses by lowering their branch costs accordingly. Yet that doesn’t mean they can initiate large-scale branch closures, because even many heavy internet users still want the security of having easy access to a nearby branch. In other words, an older customer may use only the branch for transactions, while another may use the branch and online channels – and a third might rely mostly on the online and mobile channels but may still want access to a branch.

Making the economics work

In a perfect world, a majority of customers would use the branch only for high-value transactions. Low-value transactions would be handled through less expensive channels that lower item-processing costs. Banks need to provide all channels to be competitive, but the trick is to get the customer to use the right channel. Banks need a migration strategy that encourages preferred behavior but does not mandate it. The best source of new potential online users is the branch, which only makes sense, since the branch is where most new accounts are opened. Hence the branch must get its share of the online revenue pie – a point missed by most organizations. But there is a relatively small window in which new customers can be encouraged to use other products and services, like online bill payment, which is not very frequent.
The benefits of a successful migration strategy show up on the bank’s bottom line: the profitability of online customers is definitely higher than that of offline customers, as they are cheaper to serve and more reluctant to leave your institution. One of the more challenging aspects of managing a multi-channel distribution strategy is delivering the same level of consistency across the spectrum. And especially for an alternative channel like online, it has to be simple, fast, and secure. But providing a consistent level of service quality requires that all the channels be consolidated. Banks have done a better job of linking the separate systems that support their consumer loan, savings, and checking account
products, so that the branch executive could sit with a customer and call up a summary of the customer’s entire relationship. Many clients are looking for more cross-channel integration, so that if a person started filling out a loan application online and then ran into trouble, they could come to the branch and finish the application with the assistance of a customer service representative.

Continual evolution

As transactions continue to shift from the branch to alternate channels, the branch will need to redefine its role. As alternative channels siphon off an increasingly greater percentage of low-value transactions, that should free up the branch to concentrate on higher-value transactions that offer greater revenue potential to the bank. This would allow branch personnel to pursue a relationship management strategy that would transform the branch into more of a consultative sales center than a transaction site. Sounds logical – except that most banks aren’t anywhere near ready to adopt this strategy because, traditionally, banks have sold a narrow set of products through their branch networks. What most banks haven’t done well is integrate the sales and marketing of their other core consumer products into the branch environment. Another problem is that the checking account has heretofore been the core relational product between the customer and the bank, but the steady growth of the online channel has been luring an increasingly greater number of those transactions away from the branch. This becomes an issue when an institution decides to shift its branch strategy to more of a relationship management approach, because the payments vehicle – which includes the old tried-and-true checking account, a debit card linked to that account, and a credit card – drives many of the other consumer products that banks sell and helps them to understand the customer’s financial patterns and tendencies.
The challenge is to use the branch as the primary vehicle in developing a deeper relationship in an environment in which transactions tied to the traditional product are increasingly moving online. The transformation of the branch from a transaction hub to a relationship hub won’t be an easy one. It will require branch personnel to sell in more of a consultative capacity, taking the time to fully understand the customer’s needs. But if banks can pull off that transformation, it will breathe new life into an iconic industry symbol whose obituary has been written countless times in recent years.
Intellectual capital
By Abhijit Talkukdar
IC practitioners the world over have been pushing to prove the value of Intellectual Capital in the modern knowledge economy. There is no doubt about the importance of Intellectual Capital within the worldwide IC community. Countless management gurus and other thought leaders have explained how it is the intangible assets in an enterprise that provide lasting value and competitive advantage to the business. They have even conceptualized and published frameworks for discovering, measuring and managing the IC of firms. But what about the average man on the street? Does he understand the concept of IC as easily as he understands accounting concepts such as balance sheets and income statements? What about the average investor? Does he bother to investigate the quantum of intangible assets within a business before making his investment decision? The answer is an emphatic NO. For all the volumes of literature produced by the IC community over the last twenty-five years, the common man remains ignorant about the role of IC as a value driver of the modern knowledge economy. However, whenever I have personally ventured to explain the relevance and importance of IC to the average man directly, I have found immediate acceptance every single time. This leads me to believe that the common man is perhaps not ignorant, just constrained. He is compelled to ignore IC because he does not have any source that dispenses this information to him readily. This compulsion on his part results in a lack of demand for related information, and consequently business leaders are only too happy not to provide the supply. Thus we have a cycle of deadlock: there is no demand because there is no supply of related information, and there is no supply of such information just because there isn’t any demand. This cycle needs to be broken before IC can become relevant and topical in the business context. But how?
I have attempted to do this by enabling investors to inspect the Intellectual Capital of the top 50 Indian businesses (Nifty stocks) every quarter. This is, at the moment, a proof-of-concept application called icTracker. It enables the investor to ask three fundamental questions about every firm in the Nifty list.

Is the business making money? This is the most fundamental question that any investor wants to ask. Nobody wants to invest in a loss-making proposition; however, the fact remains that accounting profits as published by firms in their income statements do not correctly reflect whether the business is making money. What we really want to know as investors is whether the business is making economic profits. In other words, are accounting profits enough to cover the cost of capital and then some? We use a concept known as EVA (Economic Value Added) for this purpose. EVA was popularized by Stern Stewart & Co in the early nineties. Many businesses worldwide took to this concept like fish to water, believing that this was one of the fundamental measures of success for any business. To us, it is only the first step, but a very important one. We want to pick businesses that consistently have a positive and, hopefully, increasing EVA. In case we find that EVA is negative during some time periods, we need to probe whether it is due to genuine reasons such as large capital expenditures incurred for driving future growth. But to keep it simple, always look for consistently positive EVA.
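The EVA arithmetic described above can be sketched in a few lines. This is a minimal illustration, not icTracker's actual calculation: the function name, all rupee figures, and the 10% cost of capital are assumed values for the example.

```python
# Minimal EVA sketch: economic profit = NOPAT minus a capital charge.
# All figures are hypothetical (Rs crore); the 10% weighted-average
# cost of capital (WACC) is an assumed rate, not a published one.

def eva(nopat: float, invested_capital: float, wacc: float) -> float:
    """Economic Value Added: after-tax operating profit less the
    charge for the capital employed to earn it."""
    return nopat - invested_capital * wacc

# Rs 120 crore of after-tax profit on Rs 1,000 crore of capital at a
# 10% cost of capital leaves Rs 20 crore of economic profit:
print(eva(120.0, 1000.0, 0.10))   # 20.0

# The same accounting profit on Rs 1,500 crore of capital destroys value:
print(eva(120.0, 1500.0, 0.10))   # -30.0
```

The screen in the article simply asks that this number stay positive quarter after quarter.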
Do knowledge assets dominate the business? If the answer to Q1 is YES, move on to Q2, where we ask whether knowledge assets dominate business assets. In other words, we want to ensure that we stay away from investing in businesses that are dominated by physical or financial assets. Why? Because we want to invest in businesses that have a high degree of competitive advantage, and it has been shown repeatedly that competitive advantage is derived from knowledge assets rather than physical or financial assets. Investing gurus such as Warren Buffett have gone to the extent of coining their own word for such advantage – MOAT. Buffett likes to invest in businesses that have a high degree of economic moat, and we should too. In the IC world, measuring Knowledge Basis – the ratio of Intellectual Capital to the Total Value of the firm – gives us an idea, in percentage terms, of whether knowledge assets dominate the business. Here we should look for a ratio of more than 50% consistently over a period of time, which indicates that the business has a high degree of competitive advantage and is able to maintain it over that period.

Is the stock undervalued? If we got this far, then clearly we have identified a business which consistently makes money and has a consistently high proportion of knowledge assets. With these two simple questions, we have been able to zero in on an attractive investment proposition. All that is left to ask now is whether the timing is right for making the investment. For this we check whether the stock is undervalued. How? The Intellectual Capital of the business is already computed from published results. Book assets are available from published balance sheets. Add the two together and divide by the number of shares to get the intrinsic stock value. Compare this to the market price, which is also readily available, and we know whether the stock is undervalued and hence whether it is the right time to invest. That’s it.
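The three questions can be strung together into a small screening sketch. Everything here is illustrative: the class, its fields, and the sample numbers are assumptions (only the 50% Knowledge Basis threshold comes from the article), and the hard part – actually computing a firm's Intellectual Capital – is taken as a given input rather than reproduced.

```python
# Hypothetical sketch of the three-question screen; not icTracker's code.
# Intellectual Capital (IC) is assumed to be pre-computed elsewhere.
from dataclasses import dataclass

@dataclass
class FirmQuarter:
    eva: float                   # economic profit for the quarter (Rs crore)
    intellectual_capital: float  # computed IC (Rs crore)
    book_assets: float           # book value of assets (Rs crore)
    shares: float                # shares outstanding (crore)
    market_price: float          # market price per share (Rs)

def knowledge_basis(q: FirmQuarter) -> float:
    """Knowledge Basis: IC as a fraction of total firm value (IC + book assets)."""
    return q.intellectual_capital / (q.intellectual_capital + q.book_assets)

def intrinsic_value_per_share(q: FirmQuarter) -> float:
    """(IC + book assets) divided by shares outstanding."""
    return (q.intellectual_capital + q.book_assets) / q.shares

def passes_screen(history: list) -> bool:
    """Q1: consistently positive EVA; Q2: Knowledge Basis above 50%
    in every quarter; Q3: latest market price below intrinsic value."""
    if not all(q.eva > 0 for q in history):                   # Q1
        return False
    if not all(knowledge_basis(q) > 0.5 for q in history):    # Q2
        return False
    latest = history[-1]
    return latest.market_price < intrinsic_value_per_share(latest)  # Q3

# Hypothetical firm: two quarters of positive EVA, IC-dominated,
# currently trading below its intrinsic value of Rs 105 per share.
history = [
    FirmQuarter(eva=50, intellectual_capital=6000, book_assets=4000,
                shares=100, market_price=90),
    FirmQuarter(eva=60, intellectual_capital=6500, book_assets=4000,
                shares=100, market_price=95),
]
print(passes_screen(history))  # True
```

The same structure makes the failure modes explicit: a single loss-making quarter, a Knowledge Basis at or below 50%, or a price above intrinsic value each rejects the stock.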
In just three easy steps, we are able to identify a good, long-term investment.

Abhijit Talkukdar is principal consultant at Attainix Consulting. He can be reached by email at abhijit.talkukdar@attainix.com.
Mobile
Now the payments world is on the verge of making its next obvious stop at one of the most widely used electronic innovations in the world – namely, the mobile phone. For those who have seen this development, especially the card evolution, its complexities, and the time it took for the environment to reach its current maturity level, it comes as no surprise that the financial world is taking its sweet time to give its stamp of approval to this modern style of paying for goods and services.
From humble beginnings, every service and commodity since the dawn of civilization has used barter as a means of exchanging value. Starting from the simple exchange of commodities for services or other goods not available with the person who needed them, it evolved through precious metals, coins, paper, plastic and, lastly, electronic modes. Electronic payments were first used by banks transferring funds among themselves. Now the payments world is on the verge of making the next obvious stop at one of the most widely used electronic innovations in the world: the mobile phone. For those who have seen this development, especially the card evolution, its complexities, and the time it took for the environment to reach its current maturity level, it comes as no surprise that the financial world is taking its sweet time to give its stamp of approval to this modern style of paying for goods and services. To make matters worse, there are multiple ways in which this can be achieved, and in the end only one or a few models may survive, unless interoperability standards and protocols emerge. So, what is the story behind Mobile Payments? Let's explore some of its major aspects.

Business Models

Mobile Payment solutions can be implemented in more ways than one. The differences mainly concern the number and type of players involved in the transaction and settlement cycles, and how they are connected. On this basis, the following are some of the currently popular models; new models are emerging regularly.
• Bank Centric: In this model, the customer instructs his bank via a mobile-based application to debit his account and transfer the funds to the beneficiary. The inter-bank communication happens via existing payment gateways. The solution includes appropriate checks in the form of passwords, encryption, etc.
• Application Service Provider Centric: This is similar to the above; however, the inter-bank fund transfers / settlement happen via the service provider. This creates a closed user group consisting of the banks connected to the mobile payment service provider.
• Telecom Service Provider Centric: Here, the customer instructs his telecom service provider to transfer funds to the beneficiary. The transferor and transferee pay / receive the funds
MOBILE PAYMENTS
1. Mobile banking investments have very high ROIs (into the multiple hundreds of percent) when largely successful by way of customer impact, cost and adoption. There is a fall-off in the performance of less successful implementations, which potentially have negative ROIs when adoption is very low.
2. Measurement matters: the most advanced banks in mobile generally had the best sense of mobile's effect on customer behavior.
• Measuring outcomes is the key to the development of impactful future initiatives;
• Channel economics are important, but understanding the economic impact on the entire customer relationship is critical;
• Measurement enables stakeholder buy-in, which avoids the risk of mobile being seen as cost-only, or of pressure to generate revenues from mobile banking fees (which ultimately stifles usage).
3. Minimizing fees drives greater engagement from customers, which will be critical for future opportunity capture:
• Cross-selling of both financial and non-financial products;
• Reaching the underbanked, the next generation of mobile users, who otherwise may never become bank customers;
• Orchestrating lifestyle management: preferences, content, information.
4. Banks can be successful by being technologically strong or functionally strong, but the pacesetters are both. Leaders continually monitor and leverage the evolution of handset functionality.
5. The mobile channel is still early in its maturity, but must be recognized as an integral delivery channel, with a dedicated, passionate support structure.
6. Staff engagement is widely cited as critical, and a far better investment than external marketing.
Overview:
This hallowed and service-oriented financial institution relies on Good for Enterprise to protect bank data on the mobile devices its employees prefer, while delivering the security and management the company requires.
The Challenge:
Union Bank needed to find a secure mobility solution that supports the popular devices employees want to use, such as the iPhone, iPad, and Android-based handhelds.
The Solution:
Good for Enterprise is a powerful, easy-to-use enterprise mobility suite that secures bank data on mobile devices. It gives IT the safety and control it needs, while giving employees a great mobile collaboration experience on the devices they want.
The Result:
Union Bank can implement an individual-liable model, which allows employees to use their own devices for work, at their own cost, and gives IT control of bank data, both over-the-air and on the devices.
via the billing mechanism in the case of post-paid accounts, or via already purchased talk time in the case of pre-paid accounts.
• e-Wallet model: Here, the customer's and the beneficiary's mobiles act as temporary stores of value. This is akin to the customer going to the bank, withdrawing cash, and handing it over to the beneficiary, who in turn goes to his bank and deposits it in his account. In this model, instead of traditional cash, e-cash is transferred between mobiles. Of course, this model has to implement the necessary security frameworks.
Each of these business models has its own pros and cons, and its own service charge / commission-sharing modalities.

Technology Models

On the connectivity / technology front, these business models can be implemented in various ways. The following are the currently popular approaches.
• GPRS: Here, a mobile handset based application connects to an application server using GPRS connectivity. It passes the details of the customer-initiated transaction to a server located in the bank's / application service provider's data center, which follows up with the subsequent transaction processing. Confirmation / intimation / rejection also happens between the server and the mobile using GPRS connectivity.
• SMS: Here, transaction details are communicated from the customer's mobile to the server using SMS. On receipt of the transaction, the server acts as in the above model, except that confirmations / cancellations are also communicated via SMS.
• USSD: Under this model, the customer's handset communicates with the telecom service provider's server using special commands. The telecom server passes this information to the application server, which follows up with banks / payment gateways for transaction processing.
• NFC: Here, a customer's mobile handset communicates using Near Field Communication technology with similar devices at merchant establishments, which in turn pass this information via the existing POS or similar networks to the existing acquirer bank / payment gateways. From there onwards, transaction processing happens in the traditional way.
Each of these technology models has its own costs, which can be borne or shared by the customer, merchant, telecom service provider, application service provider or bank, besides its own convenience, security, speed and availability levels.
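To make the server side of the SMS model concrete, here is a minimal sketch of how a payment instruction might be parsed before it is handed to the bank or gateway. The "PAY &lt;beneficiary&gt; &lt;amount&gt;" message format is purely an illustrative assumption; real schemes define their own message syntax, authentication and encryption on top of this.

```python
def parse_payment_sms(text):
    """Parse a hypothetical 'PAY <beneficiary-id> <amount>' SMS instruction.

    Returns (beneficiary, amount) or raises ValueError on a malformed message.
    """
    parts = text.strip().split()
    if len(parts) != 3 or parts[0].upper() != "PAY":
        raise ValueError("unrecognised instruction")
    beneficiary = parts[1]
    amount = float(parts[2])          # malformed amounts also raise ValueError
    if amount <= 0:
        raise ValueError("amount must be positive")
    return beneficiary, amount

beneficiary, amount = parse_payment_sms("PAY 9820012345 250.00")
# After this step the server would debit the customer, credit the beneficiary
# via the payment gateway, and confirm back to both parties over SMS.
```

The same validate-then-hand-off shape applies to the GPRS and USSD models; only the transport and message framing differ.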
Chetan Dighe is a VJTian from Mumbai. He has worked in the finance vertical both in India and abroad with extensive hands-on experience in payments. Email chetan@moment.com
BYOD started off innocently when organizations began honouring their stars with the latest gizmos, to pamper them and/or get them to do more work. There are a lot of executives who spend their entire working lives being connected. This led to several head honchos and C-level function heads, then their second lines, and soon everyone else, bringing in their own devices, which have been conveniently let through the physical security perimeter because these defaulters are people whom security personnel dare not question. The issue here is not the intent of senior professionals, but the fact that malware and other suspicious objects can creep past the security cordon without the hosts realizing it. The main concern facing CIOs is where they should get started. Juniors can be forced to comply, but what about the boss? For the past several years, infrastructure and operations (I&O) organizations have pushed for standardized, locked-down corporate PCs in order to allow as little variation as possible. They want few surprises and even fewer support calls. While this approach might keep IT operations costs lower, it brings an unintended consequence of stagnation from the worker's point of view. Many employees invest in and use their own personal computers or smartphones to assist them in their job functions, and the number is growing by the day, including the wannabes, or employees wanting to be considered for growth. Given the competitive advantages that empowered workers may bring, and the risks associated with unknown behaviors, embracing empowered workers and unlocking bring-your-own-device (BYOD) programs are more important now than ever. That doesn't mean embracing anarchy; rather, it means changing mindsets from prohibition to channeling and enablement, setting you apart from your peers.

Setting up processes for BYOD

One of the critical first steps is risk management, i.e., evaluating and understanding who among your people needs BYOD.
Once this assessment is done, the rest belongs more to access control, processes and compliance.

Understand work styles

The technology that employees use for their jobs should be a function of their work styles. However, it's true that many I&O professionals have a better understanding of technology and internal processes than of the nuances of employee work styles and productivity drivers. It will take a concerted effort and a formal initiative to shift this imbalance toward greater work-style knowledge. I&O pros should be able to answer questions such as:
1. Who works more away from a desk?
2. Who is willing to buy their own devices for work?
3. Who would be happy with a locked-down computer?
4. Who uses advanced collaboration tools?
5. Who poses the most information risk?
6. Who would lose significant productivity with hosted virtual desktops?
Not every employee is a candidate for BYOD. To develop a meaningful understanding of who will benefit from BYOD initiatives, I&O pros need an analysis of employee work styles and an exercise to group similar work styles into a handful of personas. Workforce technology needs assessment methodologies exist to help clients perform these assessments and get to know their workforce. In addition to creating productivity enhancements and cost savings, these assessments illuminate which workers will benefit from BYOD programs, and they help I&O pros form a rational basis for deciding for whom BYOD is not appropriate. It should take no longer than 30 days to assemble a sufficiently improved understanding of work styles to feed the next steps in the process.

Learn the available tools

The solution to variation in BYOD programs is a combination of client virtualization, using the correct management tools for the job, education, and matching skills. There are several methods for providing workers with a standardized Windows environment without a corporate device. The most common are hosted virtual desktops, desktop-as-a-service, and locally deployed and managed virtual desktops. Each carries its own benefits and drawbacks; research them to find out how they can, or can't, help you.

Evaluate benefits

The benefits of a new process intended to improve working conditions should be evaluated carefully before policy making. Management has to make a careful analysis of the new method, and whenever the new method is found to be markedly superior to the old, it should be adopted as the standard for the whole establishment. This is a business paradigm that has many takers.

Define access levels

A natural outcome of these exercises is a better understanding of who can potentially move into self-support.
Workers with moderate or better technical abilities, few dependencies on internal or legacy applications, and low security requirements are the best candidates for self-support initially. They'll also be the least likely to need help to remain productive with the computer of their choice. While the self-support zone may initially be small, client virtualization, community development, and self-service tools will allow you to rapidly expand it. Client virtualization is already a proven way to improve supportability, and it works particularly well for BYOD programs when properly matched to work styles. By supplying BYOD workers with a standardized Windows environment in either a hosted or a locally deployed virtual machine, you can provide a clean separation between work and personal applications and data, while also improving manageability and supportability.
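The self-support triage just described can be sketched as a simple rule. The attribute names and the scoring thresholds below are illustrative assumptions, not a standard methodology; a real assessment would come out of the 30-day work-style analysis the article recommends.

```python
def self_support_candidate(worker):
    """True if a worker is a good initial candidate for BYOD self-support.

    Criteria mirror the article: moderate-or-better technical ability,
    few legacy-application dependencies, and low security requirements.
    """
    return (worker["technical_ability"] >= 3          # on an assumed 1-5 scale
            and not worker["legacy_app_dependency"]   # few internal/legacy ties
            and worker["security_requirement"] == "low")

# Hypothetical personas from a work-style assessment
workers = [
    {"name": "field sales rep", "technical_ability": 4,
     "legacy_app_dependency": False, "security_requirement": "low"},
    {"name": "core-banking operator", "technical_ability": 2,
     "legacy_app_dependency": True, "security_requirement": "high"},
]
candidates = [w["name"] for w in workers if self_support_candidate(w)]
```

Running the rule over the two personas selects only the field sales rep, matching the intuition that front-office mobile workers are the natural starting point for a BYOD program.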
By SAMEER J Ratolikar
Crimeware is malicious software covertly installed on computers. Most crimeware programs are, in fact, Trojans. Trojans are designed to do different things: some log every key you type (keyloggers), some capture screenshots when you are using banking web sites, some download other malicious code, and others let a remote hacker access your system. What they all have in common is the ability to steal confidential information, such as passwords and PINs, and send it back to the criminal, who may then use it to steal your money. Crimeware is defined as any malware designed for illegal online commercial gain. Some of the Web 2.0 attack vectors that are flavors of crimeware are keyloggers, web Trojans, session hijackers and, finally, crimeware using blended threats. Let us understand crimeware and some countermeasures.
Keyloggers are one of the most dangerous attack vectors, responsible for capturing a user's keystrokes and sending them silently to the hacker's command and control centre. One technique is to install as a Browser Helper Object (BHO) that gets activated upon the user typing a specific URL; with this approach the keylogger may remain dormant for some time and activate only when the user types a specific bank's URL.
Another is to install as a kernel-level device driver, which can record keystrokes and send them via FTP/HTTP to a remote server. Keyloggers are emerging as a big threat, and hackers have shifted their focus from typical email-based phishing to keylogging attacks to capture users' credentials.
Session hijacking in the form of a Man-In-The-Browser attack occurs when malicious software residing in a component of the user's browser hijacks the session: when the user initiates a transaction, the transaction details are manipulated in real time. When the session hijacking is performed remotely, it is a Man-In-The-Middle (MITM) attack, also known as a 'bucket-brigade attack'. Here, the attacker takes a position between client and server and intercepts communications. A pharming attack is one of the better examples of an MITM attack: the DNS entries used by the victim's machine are changed to direct the user to a malicious server. Another example of MITM using session hijacking is the web proxy attack, in which a malicious web proxy receives all traffic from a compromised computer and relays it to the legitimate site, intercepting it on the way. When this attack is committed on a broadband router, forcibly modifying the router's DNS entries, it takes the shape of 'drive-by pharming'.
Web-based Trojans, once installed on a machine, silently wait for the user to visit a particular
website. When the user visits that website, the Trojan silently places a fake window over the actual login window, thereby capturing the user's credentials.
Crimeware is distributed in many ways; the malware propagates via email attachments, via piggybacking, via application-layer attacks such as Cross Site Scripting (XSS), or via a combination of all of these, called hybrid attacks. Email attachments: spam/phishing emails usually carry attachments; when the user clicks on such an attachment, malicious code, perhaps a keylogger or other malware, executes on his machine. Piggybacking: the user unknowingly downloads some audio/video file, and hidden malware gets installed on the machine. Cross Site Scripting (an application-layer attack): malicious code is planted on the server itself.
There are several steps you can take to protect your computer from today's cyber threats and ensure crimeware protection. Following the simple guidelines below will help minimise the risk of attack.
• Protect your computer by installing Internet security software.
• Install security patches for your OS and applications.
• If you receive an email with an attached file (Word documents, Excel spreadsheets, .exe files, etc.), don't open it unless you know who sent it, and only then if you're expecting it.
• NEVER open an attachment sent in an unsolicited (spam) email, or IM (Instant Messaging) messages that contain links.
• Update your security software regularly (at least once a day).
• Keep all other applications updated.
• Use your computer's administrator account only if you need to install software or make system changes. For everyday use, create a separate account with only limited access rights (this can be done through 'User Accounts' in 'Control Panel'). By doing this, you limit a malicious program's access to valuable system data.
• Back up data regularly onto secure media.
If your files become damaged or encrypted by a malicious program, you can then copy them back. To protect against malicious code and hacker attacks: 1. Install internet security software. 2. Install security patches. 3. Be wary of unsolicited email. 4. Secure administrator-rights log-ins. 5. Back up data. These steps will help you, but they are by no means guaranteed protection from crimeware. Only one last bit of advice: be wary, at all times!
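The attachment advice above can be partly automated at the mail gateway. A minimal sketch, with the caveat that the extension list and quarantine policy are illustrative assumptions only, not an exhaustive or recommended production rule set:

```python
# Extensions commonly associated with executable payloads (illustrative, not exhaustive).
RISKY_EXTENSIONS = {".exe", ".scr", ".js", ".vbs", ".bat", ".com"}

def should_quarantine(filename, sender_known, expected):
    """Quarantine executable attachments outright; hold any attachment
    that is unsolicited or from an unknown sender for manual review,
    mirroring the 'only if you know who sent it and are expecting it' rule."""
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext in RISKY_EXTENSIONS:
        return True
    return not (sender_known and expected)

flagged = should_quarantine("invoice.exe", sender_known=True, expected=True)
held = should_quarantine("report.xls", sender_known=False, expected=False)
ok = should_quarantine("report.xls", sender_known=True, expected=True)
```

Here `invoice.exe` is quarantined regardless of sender, the unsolicited spreadsheet is held, and the expected spreadsheet from a known sender passes through.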
In information security circles, 2014 has been a year of what seems like a never-ending stream of cyber threats and data breaches, affecting retailers, banks, gaming networks, governments and more. The size, severity and complexity of cyber threats continue to increase, and here are the five security trends that will dominate the year.
Cyber crime: The Internet is an increasingly attractive hunting ground for criminals, activists and terrorists motivated to make money, get noticed, cause disruption or even bring down corporations and governments through online attacks. Today's cyber criminals are highly skilled and equipped with very modern tools. Organizations must be prepared for the unpredictable, so that they have the resilience to withstand unforeseen, high-impact events. Cyber crime, the rise of online activism (hacktivism), the increasing cost of compliance to deal with the uptick in regulatory requirements, and relentless advances in technology, against a backdrop of underinvestment in security departments, can all combine to cause the perfect threat storm. Organizations that identify what the business relies on most will be well placed to quantify the business case for investing in resilience, minimizing risk impact.
Privacy and Regulation: Most governments have created, or are creating, regulations that impose conditions on the safeguarding and use of Personally Identifiable Information (PII), with penalties for organizations that fail to sufficiently protect it. Organizations need to treat privacy as both a compliance and a business risk issue, in order to reduce regulatory sanctions and business costs such as reputational damage and the loss of customers due to privacy breaches. There are increasing plans for regulation around the collection, storage and use of information, along with severe penalties for loss of data and breach notification, particularly across the European Union, but this needs to be expanded across the world.
This is expected to continue and develop further imposing an overhead in regulatory management above and beyond the security function and necessarily including legal, HR and Board level input. Organizations should look upon the global struggles with data breach regulation and privacy regulation as a gauge, learn and plan accordingly. Threats From Third-Party Providers: Supply chains are a vital component of every organization’s global business operations and the backbone of today’s global economy. However, security heads are growing more concerned about how open they are to numerous risk factors. A range of valuable and sensitive information is often shared with suppliers, and when that information is shared, direct control is lost. This leads to an increased risk of its confidentiality, integrity or availability being compromised. Even seemingly innocuous connections can be vectors for attack. The attackers who cracked Target exploited a web services application that the company’s HVAC vendor used to submit invoices. Third-party providers will continue to come under pressure from targeted attacks and are unlikely to be able to provide assurance of data confidentiality, integrity and/or availability. Organizations of all sizes need to think about the consequences of a supplier providing accidental, but harmful, access to their intellectual property, customer or employee information, commercial plans or negotiations. And this thinking should not be confined to manufacturing or distribution partners. It should embrace professional services suppliers, lawyers and accountants, all who share access
to your most valuable data assets. Info-security specialists should work closely with those in charge of contracting for services to conduct thorough due diligence. It is imperative that organizations have robust business continuity plans in place to boost both resilience and senior management's confidence in the function's abilities. A well-structured supply chain information risk assessment approach can provide a detailed, step-by-step way to break an otherwise daunting project into manageable components. This method should be information-driven, not supplier-centric, so that it is scalable and repeatable across the enterprise.
BYOx Trends in the Workplace: The bring-your-own (BYO) trend is here to stay whether organizations like it or not, and few organizations have developed good policy guidelines to cope. As the trend of employees bringing mobile devices, applications and cloud-based storage and access into the workplace continues to grow, businesses of all sizes are seeing information security risks being exploited at a greater rate than ever before. These risks stem from both internal and external threats, including mismanagement of the device itself, external manipulation of software vulnerabilities and the deployment of poorly tested, unreliable business applications. If you determine that the BYO risks are too high for your organization today, you should at least make sure to stay abreast of developments. If you decide the risks are acceptable, make sure you establish a well-structured BYOx program to minimize them. If implemented poorly, a personal device strategy in the workplace could lead to accidental disclosures, due to the loss of the boundary between work and personal data and to more business information being held and accessed in an unprotected manner on consumer devices. And realistically, expect that your users will find a way to use their own devices for work even if you have a policy against BYOx.
You may stop it from coming in at one point, but it will find a way around. The power of the user is just too great.
Engagement With Your People: And that brings us full circle to every organization's greatest asset and most vulnerable target: people. Over the past few decades, organizations have spent millions, if not billions, of dollars on information security awareness activities. The rationale behind this approach was to take their biggest asset, people, and change their behavior, thus reducing risk by giving them knowledge of their responsibilities and actions. But this has been, and will continue to be, a losing proposition. Organizations need to make positive security behaviors part of the business process, transforming employees from risks into the first line of defense in the organization's security posture. Organizations need to shift from promoting awareness of the problem to creating solutions and embedding information security behaviors that affect risk positively. The risks are real because people remain a 'wild card.' Many organizations recognize people as their biggest asset, yet many still fail to recognize the need to secure 'the human element' of information security. In essence, people should be an organization's strongest control. Instead of simply making people aware of their information security responsibilities and how they should respond, the answer for businesses of all sizes is to embed positive information security behaviors so that 'stop and think' becomes a habit and part of the organization's information security culture. While many organizations have compliance activities which fall under the general heading of 'security awareness,' the real commercial driver should be risk, and how new behaviors can reduce that risk. The author is a seasoned security professional with global experience. He is currently CISO, Axis Bank.
Process
Many financial services companies have turned their inbound call centers into powerful engines for growth. ICICI Bank had tele-callers who have since been repositioned as Phone Banking Officers, thereby delivering substantial customer value, increasing self-worth and improving staff retention. Call centers today generate up to 25% of total new revenues for some credit card companies and up to 60% for some telecom companies. The top priority of agents in these call centers is resolving service issues. But they are also encouraged to initiate conversations to uncover customer needs, and this component of the job can lead to sales of new products or upgrades of current ones. Partly for fear that unsolicited sales pitches will put off existing and new customers, other sectors, such as retail banking, have been slow to turn service calls into sales calls. Research indicates that when agents meet the service needs of customers and then ask them about their broader needs in a sincere way, customers are often receptive to trying or buying new products.
Research indicates that companies have failed to tap the full revenue potential of their call centers because they just don’t understand the extent of the opportunity. In financial services, for example, it is estimated that every five inbound service agents
could generate as much new retail business as one mature branch. Since call centers handle more than 25%, and in some cases over 80%, of all customer interactions for top banks in India, efforts to cross-sell during inbound service calls could increase annual sales of new products by an amount equivalent to 10% of the retail sales generated by a bank's entire branch network. To transform call centers, banks (and other companies in the early stages of efforts to turn service calls into sales) must first provide consistent, high-quality customer service. Then executives must ensure that all employees, from the top leadership to frontline agents, have imbibed the brand and the proper mind-set, motivation and skills.

Untapped revenue

Banks have been slow to embrace a service-to-sales program, mostly because their products
(such as home equity loans) are inherently more complicated than, say, telecommunications products (call waiting, for example). Studies on the service-to-sales performance of six large retail banks in India, which among other things involved monitoring 15 bank call center agents and evaluating more
than 600 service calls, found a large gap between these companies and those in more mature service-to-sales industries in the motivation of agents and their skills for cross-selling. In telecommunications, for instance, agents try to explore the needs of customers in half of all service calls; the service agents of many banks do so far less frequently. On average, bank agents cross-sell less than one core product for every 100 inbound calls they handle. But each call center under the study had a group of top-performing agents who sold more than three core products for every 100 calls, as well as a large number of agents with no sales at all. The findings suggest that banks have a real opportunity to improve the average performance of their agents.
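The per-100-call figures translate into a large annual gap per agent. A worked sketch, in which the annual call volume per agent is an assumed figure for illustration only:

```python
# Quantifying the cross-sell gap between average and top-performing agents.
calls_per_agent_per_year = 30_000   # assumed annual inbound volume per agent

avg_sales_per_100_calls = 1.0       # average agent (upper bound cited in the study)
top_sales_per_100_calls = 3.0       # top-performing agents

avg_products = calls_per_agent_per_year / 100 * avg_sales_per_100_calls
top_products = calls_per_agent_per_year / 100 * top_sales_per_100_calls
uplift = top_products - avg_products   # extra core products per agent per year
```

Under this assumption, lifting an average agent to top-performer levels adds 600 core-product sales per agent per year, which is why even modest skill improvements compound quickly across a large call center.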
Indeed, we know of two banks (not covered in the study) that have turned their inbound customer service call centers into powerful engines for growth. Typically, these banks sell up to four core products for every 100 calls their service agents handle, and as many as five additional relationship-building products (for example, direct deposit, online banking and bill-payment services). At this rate, it is estimated that every five inbound service agents can generate new-product sales equivalent to the sales of one mature bank branch. Making sales through call center agents does considerably bump up the demands on day-to-day management and operations processes. Nevertheless, launching a comprehensive service-to-sales program typically requires lower up-front investments and expenses than opening a handful of new branches. In any case, the extra costs incurred through higher average call handle times, additional training and coaching for agents, and lower supervisor-to-agent ratios add up to less than 10% of the new sales' value. It was found that it is not so much a lack of investment that slows the progress of most banks (and other companies) trying to turn call centers into profit centers, but rather deeply rooted mind-sets that prevent agents from achieving their sales potential. Arguments over who gets the credit for new sales, managers and agents who think that cross-selling will annoy service customers, and the perceived stigma of telephone sales combine to create a formidable change-management challenge.

Improving performance

Within two years of starting to implement a service-to-sales strategy, most bank call centers that already deliver high-quality service can boost their sales levels to at least three core products for
needs, let alone show interest in new products or services. Competence, confidence, and a genuine concern for the customer are obvious prerequisites to cross-selling. Yet it was found that several banks were trying to implement service-to-sales strategies without first establishing a strong service foundation. Call centers that lack one and nonetheless try to cross-sell will not only fail, but also risk damaging existing customer relationships. Financial services executives shouldn't underestimate the difficulty of turning around poorly performing service centers. Changes to several core processes (such as scheduling, recruiting, coaching, and measurement) are usually required, and it often takes more than a year for behavioral and process initiatives to take root and for customer satisfaction to rise.

Re-aligning organizational goals

Senior managers throughout an organization may firmly believe that selling is an important part of call center service. But when this conviction breaks down within or beyond the walls of the call center, it is often for fear that overly aggressive selling alienates customers and erodes the value of cross-selling. That is a genuine risk, but it can be mitigated. The best way to address it is to measure and monitor the customer experience by using post-call customer surveys that track the performance of individual agents. But for that to succeed, the agent must have resolved the issues at hand in a satisfying manner. Resistance to service-to-sales conversion among executives who work outside call centers is largely political. Many of these executives firmly believe that a branch "owns" each customer and that call centers are trying to grab sales (and related commissions) that rightly belong to the branch. To
management to gather testimonials from customers who are pleased with products or services they purchased through the call center. Customer feedback about which specific circumstances of sales attempts may have annoyed them can also be a useful tool. Sometimes managers and team leaders just don't believe that agents can learn to be great salespeople. Seemingly innocent comments such as “we try to avoid the word ‘sales’ because it intimidates agents” or “agents can be successful here without selling” indicate resistance and a lack of alignment. Worse still, it is common for supervisors and support-function managers to pay lip service to the importance of service-to-sales conversion while contradicting their words with their actions. At most banks in our study, the agents' supervisors spend less than a third of their time coaching agents, and fewer than a quarter of those sessions deal with ways to improve cross-selling skills. This is unfortunate, since our research suggests that the amount of coaching agents receive is a critical element in driving service-to-sales performance. One of the best ways for call center managers to set ambitious but attainable sales goals is to hold a pace-setting event, which involves working with a few teams of agents to introduce updated training, job aids, coaching, metrics, and incentives simultaneously. Success depends on the immediate measurement of results. Other needs include a constant feedback loop from the teams to the leadership about what could help them make sales, and an effort to monitor both the customer experience and the agents' morale. A successful pace-setting event gives agents and managers an objective and realistic assessment of the call center's sales potential and helps to
But they are also encouraged to initiate conversations to uncover customer needs, and this component of the job can lead to sales of new products or upgrades of current ones.
every 100 calls. To succeed, bank executives must first determine whether they need to shore up the basic service functions. Once they are confident that the service needs of their customers are being met, they must convince all bank employees—from the top leadership to frontline agents—that cross-selling is desirable and then provide agents with effective training, incentives, and call-routing tools to pick up the pace and boost sales.
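The service-to-sales economics described earlier (four core plus five relationship-building products per 100 calls, with extra costs under 10% of new sales' value) can be checked with a quick back-of-envelope calculation. Every figure below other than the per-100-call sales rates is an illustrative assumption, not data from the study:

```python
# Back-of-envelope check of the service-to-sales economics described above.
# Call volumes, per-sale value, and per-call cost are assumed for illustration.

calls_per_agent_month = 2_000   # assumed inbound calls one agent handles per month
core_per_100 = 4                # core products sold per 100 calls (from the text)
extra_per_100 = 5               # relationship-building products per 100 calls (from the text)
value_per_sale = 50.0           # assumed average value of one new-product sale
extra_cost_per_call = 0.40      # assumed added cost of longer handle times, coaching, etc.

agents = 5
calls = agents * calls_per_agent_month
sales = calls * (core_per_100 + extra_per_100) / 100
sales_value = sales * value_per_sale
extra_cost = calls * extra_cost_per_call

print(f"{agents} agents -> {sales:.0f} sales/month worth {sales_value:,.0f}")
print(f"extra cost {extra_cost:,.0f} = {extra_cost / sales_value:.0%} of sales value")
```

Under these assumed figures the extra cost works out to roughly 9% of the new sales' value, consistent with the under-10% claim in the text.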
minimize such channel conflict, one retail bank with strong service-to-sales performance designed a commission system that shares the benefits of any sale made through a call center between its agents and bankers in branches. Under this system, call center agents can book appointments with those bankers for customers, who receive a seamless service experience.
build service-to-sales support throughout an organization. An awareness of the relevant data can help to change the focus of most bank executives, the majority of whom just want to beat the previous year’s sales performance when they should actually be working toward an order-of-magnitude increase in conversion rates.
Within the call centers themselves, managers and team leaders must believe that eliciting the needs of customers and recommending products are aspects of providing great service. One technique for changing the behavior of call center agents is to study the way they feel about selling and then to address their concerns directly. If, for example, the study finds that agents (and possibly their managers) worry that attempts to cross-sell will annoy customers, one response would be for
One of the thorniest challenges in motivating agents to cross-sell is developing their skills. Over a third of the agents in the study said that they generally don’t feel comfortable with sales. An anonymous survey of more than 300 agents found, not surprisingly, that conversion rates tend to be higher at banks where more agents believe that “selling is an important way to serve customers.”
Building the service foundation
Call center agents must earn the right to sell by demonstrating that they have fulfilled their primary function: meeting the customer's immediate needs. If agents don't adequately resolve questions or problems, it is clearly inappropriate to explore other needs in hopes of making a sale. Likewise, if agents don't show empathy while resolving service inquiries, it's unlikely that customers will want to talk about their
Reinforcing will and skill
The key element in changing the will and skill of
Calling Revenue
through the same program so that they could coach agents and reinforce messages effectively.
routes them to agents with skills that match their needs.
the agents is clear: training and coaching agents in the art of cross-selling. The right performance metrics and incentives, as well as a fun working environment, also help to boost sales. But most bank leaders, excessively focused on processes, tend merely to check off boxes that in their view address each cross-selling requirement. The priority should be to change people, not just processes.
A survey of agents confirmed that in bank call centers, monetary incentives can act as a strong motivation to cross-sell, especially for top-performing agents. But while a good incentive plan is necessary, it is not a panacea. In the six banks studied, no correlation was found between cross-selling performance and the way such plans were structured, including the size or frequency of payouts and the number of agents receiving them.
Fortunately, many service agents can learn how to uncover customer needs and make referrals to sales specialists. As in other industries, high turnover makes some bank call centers resist bigger investments in training and coaching. But experience with such investments, such as ICICI Bank's repositioning of agents as Phone Banking Officers, demonstrates a bank's commitment to giving agents a career path, which helps to motivate them. More important, giving them the skills to succeed, particularly in cross-selling, makes their jobs less stressful and improves their performance. Both benefits combat attrition effectively. Adults learn most effectively by experimenting with new knowledge, integrating it into what they already know, and learning from peers rather than superiors. Yet if call center agents at most banks receive any formal training to cross-sell, it consists
Value Segmentation
One bank, for instance, harvests customers who don’t use branches but are likely candidates for home loans, which are worth up to ten times more to the bank than its other core products. It now makes nearly three home loan sales for every 100 customers harvested. To achieve this degree of success, top-performing agents—who are better at capitalizing on high-potential sales as well as more self-confident—should handle harvested callers, since being pulled out of an automated system can disconcert customers.
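The home-loan harvesting arithmetic above is worth making explicit. The ten-times value multiple and the roughly three sales per 100 harvested callers come from the text; the core-product value and caller volume below are hypothetical:

```python
# Illustrative expected-value arithmetic for the home-loan harvesting example.
# The 10x multiple and 3-per-100 conversion are from the text; the rest is assumed.

core_product_value = 100.0                  # assumed value of an ordinary core-product sale
home_loan_value = 10 * core_product_value   # "worth up to ten times more"
harvest_conversion = 3 / 100                # ~3 home-loan sales per 100 harvested callers

harvested_callers = 1_000                   # assumed monthly harvested volume
expected_value = harvested_callers * harvest_conversion * home_loan_value
print(f"Expected monthly value from harvesting: {expected_value:,.0f}")

# For comparison: the same 1,000 callers handled as ordinary service calls at
# 4 core sales per 100 would yield 40 sales worth 4,000 under these assumptions.
```

The comparison illustrates why routing harvested callers to top-performing agents can be worth the operational effort, even at a modest conversion rate.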
of no more than a single classroom program. At one bank, trainers used a call-flow mnemonic to help service agents remember how to make the transition from service to the sales pitch, but when asked what this mnemonic stood for, none of the coaches could remember. Only one agent, who had attended a training session a few weeks earlier, could recall the mnemonic's content, but the supervisors had failed to reinforce the training, so there was no opportunity to practice what was learned. By contrast, a successful bank trained its agents to cross-sell in a program that unfolded in stages over several weeks. Its agents could explain the sales methodology they had learned and remembered specific tips and phrases. Further, the bank's management sent all call center supervisors
Once a bank delivers consistently high-quality service to its customers, aligns its service-to-sales goals across the organization, and builds the crossselling skills of its agents, segmented call-routing tools can further boost sales. Managers may, for example, direct a string of similar customers—or of customers with similar problems—to a single agent. This approach allows agents to develop a sort of rhythm, which helps them to resolve problems more quickly and to make the transition to sales more smoothly. One
telecom provider, for instance, adjusted its incoming call queues to route high-value, high-potential, and at-risk customers to specially trained agents. By doing so, it improved its retention of at-risk customers and increased
satisfaction rates among high-value customers. Banks might also boost their sales by “harvesting”—pulling a few selected customers from an interactive-voice-response (IVR) system. Up to 85 percent of all bank customers who call for service never reach agents. At times when their utilization is low, merely initiating more conversations with customers creates more opportunities for sales. Once customers identify themselves, harvesting software locates the high-potential ones, pulls them out of the system, and
Banks should resist the temptation to invest excessively in automated prompts that tell agents what products to offer. If anything, this approach encourages mechanical sales pitches that rely too much on call scripts and makes it harder for agents to pick up cues from customers. In both the credit card and telecom industries, it has been observed that agents who listen to customers and ask real questions to ascertain their needs are much more likely to uncover sales opportunities than agents who follow automated prompts like robots. It is critical that banks not only address the entire menu of service-to-sales imperatives but also address them in the right order. Customers will not be receptive to cross-selling without first-rate call center service. Once agents deliver it
consistently, they must embrace selling with enthusiasm, and that simply won’t happen until team leaders and support managers demonstrate an unwavering commitment to service and sales. Above all, executives must realize that it is hard to encourage new mind-sets and behavior for all managers, agents, and support functions. But when every stakeholder is convinced of the potential gains—increased sales and long-term growth—banks can realize the full revenue potential of their inbound call centers.
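The IVR harvesting approach described above (identify the caller, score their sales potential, pull high-potential callers out of the automated system, and route them to top-performing agents) can be sketched roughly as follows. The scoring rule, thresholds, and field names are hypothetical, invented purely for illustration:

```python
# A rough sketch of IVR "harvesting": score identified callers, pull the
# high-potential ones out of the automated flow, and route them to top agents.
# The scoring rule and thresholds are hypothetical assumptions.

from dataclasses import dataclass

@dataclass
class Caller:
    customer_id: str
    uses_branch: bool
    deposit_balance: float
    has_home_loan: bool

def potential_score(c: Caller) -> float:
    """Toy propensity score: branch-avoiders with large balances and no
    home loan are the likeliest home-loan prospects (per the example)."""
    score = 0.0
    if not c.uses_branch:
        score += 0.4
    if c.deposit_balance > 50_000:
        score += 0.3
    if not c.has_home_loan:
        score += 0.3
    return score

def route(c: Caller, threshold: float = 0.7) -> str:
    """Pull high-potential callers out to a top-performing agent's queue;
    leave everyone else in the automated IVR flow."""
    return "top_agent_queue" if potential_score(c) >= threshold else "ivr"

print(route(Caller("C1", uses_branch=False, deposit_balance=80_000, has_home_loan=False)))
print(route(Caller("C2", uses_branch=True, deposit_balance=5_000, has_home_loan=True)))
```

Routing only above-threshold callers to top agents reflects the text's caution that being pulled out of an automated system can disconcert customers, so the interruption should be reserved for the highest-potential cases.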
Marketing
The brand story, as we know it, has attained huge significance in the way business is done. So there may be some questions: What is a brand? Why and how do we build it? How can we evaluate what a brand does for us? The answers to these are hidden in a further set of questions. Answer them and you will get the answers to these queries!
Q 1. Why do people sign their names with an INR 5,400 Montblanc when they can do the same with a use-and-throw gel or ball point pen for about INR 15? Q 2. Why do they buy bottled water when free alternatives are a mere arm's reach away? Q 3. Why will people refuse to buy a computer unless it has a branded chip in it, a chip they have most probably never seen? The reason is that all of these companies have taken
The development and implementation of a successful branding strategy is a must for organizations to successfully enter and compete in their marketplace for several reasons. Strong branding breaks through the clutter and confusion. Strong branding differentiates brands in a commodity market. In many markets, the delivery of health care products is a commodity. There is little differentiation in pricing, most provider systems have a broad network of providers, and people perceive that they will get better no matter which hospital they go into. Providers have found that one way they can differentiate themselves is by successfully establishing their brand. Strong branding forces you to clearly identify your target customer base and your core competencies. This
even works for small and midsize companies. To use marketing to successfully attract and retain consumers, organizations must engage consumers in four different ways. It all starts with branding. A successful branding strategy will allow an organization to generate more revenues and to spend marketing dollars more efficiently by:
. Building awareness (so customers know who they are)
. Developing the right image (so customers know what they are)
. Generating leads (to create opportunities for customer trial)
. Employing ongoing direct sales techniques (to build, maintain and enhance relationships with customers to increase market share, retention and
loyalty)
Additionally, a successful branding strategy will allow an organization to:
o Develop and increase customer loyalty
o Protect the organization from future competition
o Leverage specific products/services
o Maintain prices and product differentiation as products and services become commodities
o Enter new markets and cross-sell other products and services
The strength of a brand is why entertainment entities such as professional sports leagues and movie and television studios can create and sell products such as clothing, games and collectibles around teams, movies
and television shows. The value of a brand can be quantified. Indeed, Interbrand estimated the Coca-Cola brand, the world's most valuable, to be worth $69 billion. But you don't have to be an old-line consumer products company to benefit from a strong brand. It is important to recognize that organizations do not own their brand. Customers and potential customers own the brand; that is, these groups and what they associate the brand with determine the value of the name. To be of value, the brand must be: Relevant - something the customer
external sources such as the ad agency, research partner and others will play important roles. Obviously, a multibillion-dollar corporation such as Coca-Cola has resources not available to most other companies. Spending the dollars on research as outlined in this article simply may not be realistic, for example. But organizations can still creatively follow the outline. Instead of doing formal research, for example, companies can rely on some of their best customers for input on positioning, creative concepts, and so on. In conclusion, when considering the importance of the brand, think about
this statement: “(Company) is a family thing, a set of constant expectations in the public's mind…a certain quality; a certain type of entertainment.” Sound like Disney? If you think this statement could be talking about today's Disney, you're right: it could be. But it's not. Such is the power of a brand: the statement is indeed about Disney, but it was made in 1938, by Walt Disney himself.
Process Overview and Ownership
Like most other initiatives, a successful branding program consists of three parts: development, execution, and measurement. Also like most other initiatives, the most critical part is
what is more or less a commodity product and made it a product of choice by successfully establishing and maintaining a brand. So how do they go about doing it? Is it a book of theories that organizations follow because others are following it, or is there a logical science behind the art? If so, how come all that follow the prescribed steps do not always come out on top? How can you know how to ensure that your brand
idea development is in sync with your organizational vision? Who are the stakeholders of your brand? Is a brand all about a logo and some advertising dollars, or is it about involving human insight as part of your overall marketing plan? Who within the organization are the brand drivers, and who is the custodian? Finally, who is impacted by the success or failure of a brand? Read on to know...
cares about
Unique - different from the competition in the minds of customers and prospective customers; something that cannot be
Copied - by the competition
Trustworthy - the customer must trust the organization to deliver on the brand promise (and, in turn, the organization must be able to deliver on it).
This article provides a framework for developing, executing and measuring a successful branding strategy. The benefits of applying this framework are:
• Developing a systematic branding methodology
• Identifying competitive strengths, weaknesses, opportunities and threats as perceived by all relevant parties
• Incorporating findings into the development of the appropriate branding strategy, including positioning and online and offline media placement
• Ensuring that your branding efforts will be monitored appropriately
Development of this program will require input from key internal and external resources. These opinions are important frames of reference, and making the effort to obtain them will help ensure organizational buy-in. Additionally,
Development
Phase I: Understand current customer and prospect behavior
Phase II: Understand perceptions of current customers, prospects and stakeholders and establish benchmarks
Phase III: Develop strategy, positioning, messages and media plan
Phase IV: Conduct research to validate strategy and tactics
Execution
Phase V: Execute all aspects of the program
Measurement
Phase VI: Conduct research to measure changes in target market awareness/image, and monitor changes in indirect measures
Phase VII: Justify brand investment
Phase VIII: Plan for re-investment
development. The best way one can be certain that the branding strategy will be successful is to make sure the strategy is developed properly. The following table shows the individual phases in the branding process. While each phase has multiple elements to it, this provides a high-level roadmap of the process. Looking back at the three critical factors that determine the success of the brand, it is the development phase that will ensure the brand is relevant. Solid research and analysis will ensure that what the organization is communicating as its brand promise will be important to customers and prospects. The execution phase will help create a unique brand. By developing strong, memorable messages and communicating them appropriately, the organization can create a unique brand. It can also succeed in creating a unique brand by developing innovative products and packaging. It is important to note that this process focuses on creating and communicating the brand promise. Equally important - if not more so - is the ability to deliver on the brand promise. That is what will make the brand trustworthy. Here are just a few anonymous examples of how an organization can fail to deliver on its brand promise, despite great
The Brand ROI
advertising:
. A retailer with great ads, but whose sales are falling because it is not stocking what customers want
. An auto manufacturer that advertises safety but has a poor safety record
. A healthcare provider that promotes a quality medical staff but has poor patient satisfaction scores
. A dot-com that advertises on the Football World Cup but doesn't have the infrastructure to support demand
Because of these factors, it is easy to see that ownership of the branding initiative is organization-wide. True, the marketing department can conduct the research and execute the marketing plan, but it cannot deliver on the brand promise. That is the responsibility of the entire organization. Therefore, it is crucial that the marketing department keep all other areas of the organization in the loop as the branding strategy is being developed. If, for example, the strategy calls for a high level of service, the operations department needs to be aware of this. If the campaign will be heavy on direct response marketing techniques, there must be call projections so the call center can be staffed to handle the calls. After all, not being able to respond to calls is not “a high-class problem;” it's just a problem.
Development
Phase I: Understand the Behavior of Current Customers and Prospects
Big companies may invest hundreds of millions of dollars on advertising, promotion and other brand-building initiatives. Even the vast majority of companies that won't spend that much don't want their investment to be wasted. Therefore, it is important that they commit resources to research and analysis so they know exactly what they are doing before executing the branding strategy. The money they spend on research and analysis can be looked at as an insurance policy to make sure the execution money is
spent wisely. The first step is understanding the behavior and attributes of current customers and prospects. This can best be accomplished by building and then analyzing a Master Customer Information File (MCIF). Note that this is a “build or buy” decision. An unsophisticated database in Microsoft Excel might work very well for some smaller companies. Remember, before computers, many companies tracked their customers’ behavior with handwritten index cards. On the other hand, the more sophisticated the database, the more value can be derived from it. There are companies that specialize in building databases such as this, and these companies have powerful, sophisticated modeling capabilities that can predict behavior and calculate important measures such as lifetime value. Either way, building this MCIF will take several months.
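The text notes that sophisticated MCIF platforms can calculate measures such as lifetime value. A minimal sketch of what an MCIF record and a crude lifetime-value estimate might look like is below; the field names, sample data, and parameter values are illustrative assumptions, and real vendors use far richer data and predictive models:

```python
# A minimal sketch of an MCIF customer record with a crude lifetime-value
# estimate. Field names, sample figures, and parameters are assumptions.

from dataclasses import dataclass, field

@dataclass
class Purchase:
    amount: float        # amount of purchase
    cost: float          # costs associated with the purchase
    channel: str         # source: "direct mail", "internet", "branch", etc.

@dataclass
class CustomerRecord:
    name: str
    address: str
    purchases: list = field(default_factory=list)

    def annual_margin(self) -> float:
        """Margin contributed by this customer's recorded purchases."""
        return sum(p.amount - p.cost for p in self.purchases)

def lifetime_value(rec: CustomerRecord, retention: float = 0.8,
                   discount: float = 0.10) -> float:
    """Simple textbook LTV: margin * retention / (1 + discount - retention)."""
    return rec.annual_margin() * retention / (1 + discount - retention)

rec = CustomerRecord("A. Kumar", "Mumbai")
rec.purchases.append(Purchase(amount=1_200.0, cost=400.0, channel="internet"))
print(f"Estimated lifetime value: {lifetime_value(rec):,.0f}")
```

Even a toy model like this shows why the data matters more than the software: the estimate is only as good as the purchase, cost, and retention information captured in the file.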
Current Customers
The key to having an effective MCIF is not the software it runs in, but the data in it. For some industries, such as banking or catalogs, household-level information may be enough. For other industries, such as health care providers or airlines, information at the individual level as well as the household level is important. Customer information that should be in the MCIF includes:
. Basic information, such as name and address. Business-to-business marketers will also want contact title information.
. Purchase information, such as type of purchase, date of purchase, amount of purchase, place of purchase, costs associated with the purchase, source of purchase (direct mail, Internet, current customer referral, direct sale, etc.), and any demographic information available from the data source. Descriptive information is also important. For example, long distance carriers will want to know specific markets customers call; airlines will want to
know travel destinations. Interestingly, many nonprofits do an outstanding job of this by capturing this type of information for donations.
. Contact information, usually from a telemarketing center or Website. This includes date of contact, reason for contact, source of contact, and all demographic information associated with the contact.
. Survey information, especially factors that predict loyalty, such as likelihood to defect and likelihood to recommend
products or services. Also critical for former customers are the reasons they defected and what it would take to get them back.
. Demographic information, such as age, household income, etc. Company demographic information might include SIC code and number of employees.
. Psychographic information, such as importance of quality, price, etc.
. Lifestyle information, such as computer ownership, hobbies, etc.
Prospects
While your MCIF will analyze customers, this data needs to be augmented with market information. Market information includes:
. General market information, such as census-based demographics, which can be used to calculate market share.
. Industry-specific information. For example: in health care, number of procedures; in financial services, ownership of mutual funds; in consumer products, how much households typically spend on a product type.
Analyzing the Data
There are numerous ways to analyze the data. Perhaps the most critical is to determine market share, market size and profitability by market segment. This enables an organization to prioritize appropriate market segments for retention, upsell and cross-sell, and acquisition. Another type of analysis will identify
purchase and revenue patterns by customer segment, geographic market, product line and distribution channel to assess utilization and customer value and begin to understand which consumers are most attractive. Finally, other modeling techniques will allow an organization to predict the next purchase of a customer, given past purchases.
Phase II: Understand the Perceptions of Current Customers, Prospects and
Stakeholders and Establish Benchmarks
Phase I is designed to understand behavior. Phase II is designed to understand the perceptions of customers, prospects and stakeholders.
Customers and Prospects
The way to understand the perceptions of customers and prospects, and to create a baseline of their awareness and perception of the organization, is by conducting solid, quantitative primary research, and then analyzing that research. Designing this type of research requires a high level of expertise. For example, close-ended questions should be used exclusively. Therefore, it is very important that the survey is developed and the research conducted and analyzed by market research professionals. This will also take several months, but can be done concurrently with Phase I. To be successful, those surveyed must include a representative sample of the population. Additionally, a great deal of thought will be needed upfront to determine the market segments to be analyzed. For example, an investment firm might want to do a market survey, but might be specifically interested in people who own mutual funds. Therefore, the survey design must ensure that enough people who own mutual funds are contacted to create statistically significant results. The information captured to make a survey effective includes:
. Demographic information, such as age, income, marital status, etc.
. Psychographic information, such as cost consciousness, quality
consciousness, willingness to experiment, etc.
. Purchase behavior, such as recency, frequency, type and monetary value of purchases
. Purchase criteria, such as importance of price, quality, service, convenience, etc. (essential for positioning)
. Satisfaction with the current brand on the attributes mentioned above, as well as likelihood to switch and to recommend
. Mass and direct media viewing and response habits, including the Internet (essential in creating the media plan)
. Other miscellaneous behavior, such as events, activities, retail outlets used, etc. (essential for developing partnerships and sponsorships)
. Aided and unaided awareness of the sponsor brand and competitor brands
. Perception of the sponsor brand and competitor brands
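Once survey records carry fields like the ones listed above, the segment-level profiling the text goes on to describe becomes a simple tabulation. A hedged sketch, with entirely made-up records and hypothetical field names:

```python
# Sketch of segment-level survey profiling: for each segment, find the share
# of respondents for whom a purchase criterion matters (here "speed") and
# brand awareness among them. Records and field names are made up.

respondents = [
    {"segment": "retail",   "speed_important": True,  "aware_of_brand": True},
    {"segment": "retail",   "speed_important": False, "aware_of_brand": False},
    {"segment": "business", "speed_important": True,  "aware_of_brand": False},
    {"segment": "business", "speed_important": True,  "aware_of_brand": True},
]

for seg in ("retail", "business"):
    group = [r for r in respondents if r["segment"] == seg]
    speed = [r for r in group if r["speed_important"]]
    share = len(speed) / len(group)
    aware = sum(r["aware_of_brand"] for r in speed) / max(len(speed), 1)
    print(f"{seg}: speed-sensitive {share:.0%}, awareness among them {aware:.0%}")
```

The same pattern extends to any of the survey fields: swap the criterion and the grouping key to profile, say, cost-conscious respondents by payer class or by media habits.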
Once this information is gathered, one can then analyze what is important to these respondents and segments. Results can be analyzed by market segment (e.g., for a healthcare provider, respondents in a particular payer class). One can also profile people by how they responded. For example, if a software company believes it can differentiate itself on its product's speed, one can analyze customers and prospects for whom speed is important. Finally, this research will enable the organization to establish awareness and perception benchmarks. Research to be done after execution of the branding strategy will measure changes in awareness and perception. In addition to these benchmarks, the organization should also create a baseline for indirect measures such as current telemarketing center volume, Website hits, and current market share. Changes in these measures can also be used to quantify the impact of the branding program.
Stakeholders
While the most critical input is from
customers and prospects, the opinions of internal and external stakeholders cannot be ignored. The mere act of obtaining this feedback will help ensure buy-in. And the opinions obtained will discern whether these stakeholders' perceptions are aligned with those of customers and prospects, or whether there is a disconnect. Stakeholder feedback should be obtained through one-on-one interviews. Also, it is important that somebody not employed by the organization conduct these interviews. That will ensure objectivity, and will also make the interviewees more comfortable speaking freely. Both internal and external stakeholders should be interviewed. Internal stakeholders should include all executives and all department heads. External stakeholders should include community and civic leaders, senior managers and primary contacts at key customers, key suppliers, and the ad agency. These stakeholders should be asked for their perceptions of the organization's mission, strengths and weaknesses, and opportunities to improve. Where possible, the same close-ended questions used in the customer/prospect research should be used here. Upon completion of these interviews, the stakeholder responses can be compared to those of customers and prospects.
Phase III
Develop the Strategy, Positioning, Messages and Media Plan
Marketing can be defined as making sure you have what your customers want, and then making sure your customers want what you have. The first two phases of the branding process constituted the “initial information gathering” component to ensure the organization knows what its customers want. With this information, the organization can now begin to develop its branding strategy, which will lead to the tactics to make sure the customers want what the organization has.
Strategy, Target Market and Positioning
This process begins by analyzing all the information to determine “who you want to be.” It might include, at the very top end, developing a mission statement. One thought on the mission statement: it should not be about making money. It is a given that everybody wants to make money, even not-for-profits. Instead, the mission statement should focus on the value the organization will bring to its customers. If the market truly wants what the organization is selling, and if the organization can deliver as promised, the organization will fulfill its mission, and the profits will come. As part of the analysis, organizations should match their self-perceived strengths and weaknesses to the strengths and weaknesses as perceived by customers (whose perceptions are, in fact, reality). This can be done by creating a simple matrix comparing the internal perception of strengths and weaknesses to customers' perceptions. Combining this analysis with what it already knows about the profitability of customer segments, the organization can define its target market. This definition may include a combination of demographic, psychographic and geographic factors. Then, the organization can determine what attributes (price, quality, convenience, etc.) will attract those customers. Following this, the organization can develop its positioning, based upon how the target market perceives the organization. It can then also develop product development initiatives (not discussed in this framework) to leverage these strengths and address any important weaknesses. This part of the process concludes with development of the positioning statement. A positioning statement should be simple - no more than a sentence. It should focus on who the target market is and what the customer benefits are.
Message and Media Plan
With the positioning statement completed, creative concepts can be developed for advertising, marketing collateral, the website, etc.
Attention must be paid to tone, product emphasis, graphics and copy. All must be consistent with the positioning and designed to attract the target market. The media plan, then, will maximize reach and minimize cost per impression for the target market. The media plan should include a combination of the following:
. Mass media, such as newspapers, magazines, radio, television and the Internet
. Targeted (direct response) media such as newsletters, direct mail, outbound telemarketing and e-mail . Other integrated marketing components, such as partnerships, sponsorships and credibility marketing activities (media relations, speaking engagements, etc.)
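The reach-versus-cost trade-off above can be made concrete with a small calculation. The sketch below compares channels by cost per thousand targeted impressions (CPM); the channel names and all figures are hypothetical, for illustration only.

```python
# Sketch: ranking media channels by cost per thousand targeted impressions.
# All channel names and figures below are hypothetical.

channels = {
    # name: (cost per insertion, impressions reaching the target market)
    "newspaper": (5000.0, 200_000),
    "radio": (1200.0, 60_000),
    "direct_mail": (8000.0, 25_000),
    "email": (300.0, 40_000),
}

def cost_per_thousand(cost, impressions):
    """Cost per thousand targeted impressions (CPM)."""
    return cost / impressions * 1000

# Cheapest targeted impressions first.
for name, (cost, reach) in sorted(
        channels.items(), key=lambda kv: cost_per_thousand(*kv[1])):
    print(f"{name:12s} CPM = {cost_per_thousand(cost, reach):7.2f}")
```

A real plan would weigh CPM against each channel's suitability for the message, but a table like this keeps the reach/cost discussion grounded in numbers.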
Phase IV: Conduct Research to Validate Strategy and Tactics
There is one more step before it is time to execute. Focus groups or other interviews should be conducted with customers and prospects, as well as with stakeholders. Note: because focus groups are qualitative, the results may be biased (especially if some very strong personalities participate).
This research should test everything done so far to validate the branding strategy in terms of relevance, uniqueness and trustworthiness. It should test the mission statement, overall positioning, copy/tone elements of the media, offers and graphics. If the organization has done its homework right, this research should validate that it is on track, perhaps with some minor tweaks. If not, it is back to the drawing board.

Execution
Phase V: Execute All Aspects of the Program
This is the fun part, where the organization sees the fruits of its efforts. It's a great feeling to see the first commercial on TV, the first website hit, the first lead and the first sale. The
first tactic to be executed, however, should be internal. An internal branding kickoff is critical to inform and excite the organization about the program. This should be a big deal, with refreshments, prizes, etc. All advertising and promotional materials should be displayed. And, critically, employees must be told why the branding initiative is occurring, and how important they are in delivering on the brand promise. Since delivering on the brand promise is an enterprise-wide responsibility, all areas of the company should participate in, and deserve credit for, the launch. There is a logical order in which the branding tactics should be executed. The first tactics are the awareness-building tactics. These include activities such as media advertising (print, electronic and Web), development of a website, participation in trade shows, distribution of press releases, etc. These activities get the organization's name and message out there, and are the critical first step. Concurrently, or shortly thereafter, the organization can launch credibility-building tactics. These include speaking at conferences, white papers, customer testimonials, etc. These “third-party endorsements” will create credibility, and will make it easier to generate leads and sales. At this point, the organization is ready to embark on direct response marketing activities to generate leads and/or sales. These activities can include direct mail, telemarketing and, increasingly, e-mail marketing. If the awareness and credibility tactics were successful, the lead generation and sales activities will be that much more effective. Of course, you need to monitor activities closely as you execute, and you need to be prepared to change. But, generally, if you developed your strategy properly, you need to stay the course. For example, a home equity lender once executed a rather controversial direct mail program in implementing a strategy to offer low rates via a personalized postcard.
The lender received numerous irate phone calls. One angry respondent even returned a business reply card with a brick attached, making the lender pay the postage for the brick. But instead of giving in, the lender stayed the course, and the program ended up being one of the most successful it had ever done.

Phase VI: Conduct Research to Measure Changes in Target Market Awareness/Image, and Monitor Changes in Indirect Measures
To see if the efforts made a difference, at some interval after execution began (and at regular intervals thereafter), the organization should repeat the
quantitative benchmarking research done right before the campaign was launched. Changes should be noted for the following:
- Aided and unaided awareness of the organization and competitors
- Image of the organization and competitors
- Media usage of the people surveyed, to determine the most effective media for communicating the brand
- Cost per increased percentage point of awareness/image, both overall and by specific medium
This not only will measure changes, it will also indicate how to refine the strategy to make it more effective. Additionally, at this point the organization also should measure changes in indirect activities such as call center contacts, website hits and market share.

Phase VII:
Justify the Branding Investment
Over time, the organization can plot the correlation between increases in brand awareness and image on the one hand, and profitability on the other. That way the organization can tie its branding investment to the bottom line. Of course, the branding campaign is not the only factor that will impact revenues and profitability. The economy, the competition, and the ability to deliver on the brand promise will all also have an impact. The “art” of marketing will help determine the precise impact branding efforts have on profitability.
Conclusion
A branding strategy, when properly developed, executed and monitored, can provide major benefits. It will force an organization to carefully consider its target market and how it communicates with it. It will also ensure that it has the right processes set up to monitor the effectiveness of its efforts. Most importantly, a successful branding strategy can significantly add to the bottom line, and increase the value of the organization. But branding is more than just developing some ads or creating a direct marketing program. It takes a disciplined approach, and time. It will take at least six months from the time the project begins until the first communications hit. And then the organization has to continually monitor
results to ensure success. Finally, while the marketing department can take ownership of brand development and execution, it is up to the entire organization to deliver on the brand promise. Great advertising might allow organizations to acquire customers, but only delivery of the brand promise will allow them to retain those customers.
Reputation risk, the possibility of damaging one’s reputation, presents a threat to organizations in many ways. Little is known, however, about the connections between reputation risk management and social media as a mediated business environment. Following the latest conceptualizations of
strategic reputation management and social media, the paper identifies several challenges for organizations. To make sense of this issue, Transact offers the Social Media Reputational Risk Dashboard (SM2RD) for strategic online reputation management, founded on the metaphor of ambient publicity, which involves not only social media, but also organizations and their stakeholders.
Business Process
The way forward for companies outsourcing their processes for better ROI will be verticalization, with transparent pricing, highly standardized processes and vertical-specific solutions.
Companies will eventually select service providers that deliver the best quality and value-add, not just the best price. Increased adoption of standardized solutions in outcomes-based models will drive outsourcing to become more utility-based. Some believe that verticalization and business process utilities (BPUs) will be the major trend in outsourcing over the next five years. Once a distant vision, these dynamics are changing quite dramatically now, precipitated by the recent economic volatility. Companies are so eager to shed fixed-cost overhead that they are now willing to outsource vertical applications that are crucial to their core business. Financial and healthcare services - both saddled with cost-intensive paper processes and legacy data centers - would be early movers toward verticalization. The outsourcing industry is also seeing increasing verticalization in supply chain management. Companies in industries undergoing turbulence around new regulations, or deregulation, such as utilities and telecommunications, will find vertical solutions especially beneficial as it becomes increasingly difficult for buyers to keep track of the legal and regulatory compliance issues across every country in which they operate. Transformation is tough, and service providers cannot do it by simply polishing up processes. Lean and Six Sigma are hygiene factors, but transformation requires a technology-driven framework that completely overhauls the processes. The push for vertical solutions and for business transformation will accelerate the pace of the journey to more end-to-end solutions and utility-like platform models in the BPO space, and drive the BPUs' growth.
The Business Impact
A BPU (Business Process Utility) - a standardized platform for buying business processes, as opposed to the highly customized processes of the BPO industry - will replace a lot of traditional BPO work over the next 12-18 months. That differentiation accounts for the way buyers will adopt BPU solutions. Some clients will continue to believe that their processes are unique, and will still want BPO services to have things done their
way. A BPU adapts well to the needs of small and midsized businesses that want to buy process services just as they buy utility (telecom, electricity) services. Large enterprises will also turn to BPUs, because they are looking everywhere they can to maximize investments, including existing BPO arrangements. A BPU offers an optimal solution where the business process is a service, and having process optimization embedded into the technology platform greatly enhances the value delivered to clients. Although the market is advancing slowly, buyers are beginning to recognize the ease of standardization, understanding that the platform approach not only meets but in many cases exceeds their individual requirements.
Business process utilities will be highly attractive to buyers that want business transformation, but want it through quick access to more efficient processes rather than through the sustained, complex change management exercise that customized services require. BPUs will be attractive to service providers because standardizing processes enables them to leverage best practices and deliver services at a lower cost. The best targets for shifting to BPU services are processes where one can easily see the beginning and end of the process - those that are fairly independent of other departments in the buyer organization and can stand on their own. Once a process starts linking to multiple processes or systems, or cutting across departments, it becomes complex and hence difficult to standardize onto one platform. Companies that have independent product lines and fairly simple processes, and that have seldom invested in this area because it was not their core business, would be very willing to adopt a BPU
VERTICALIZATION
The growth of vertical solutions, which will also increase adoption of business process utility (BPU) services in outsourcing, will be a major change over the next five years. Buyers will reap the benefits, including transparent pricing, highly standardized processes that embody best practices, lower costs, and the ability to plug in and out of the service providers that have the best quality and value-add, not just the best price. BPUs will replace a lot of traditional BPO over the next 12-18 months. The differentiation: a BPU is a standardized platform for buying business processes, as opposed to the highly customized processes used in the BPO industry. The impact of BPUs on outsourcing will be dramatic, in that the model will allow big, monolithic companies to offer capabilities and services to small and midsized businesses, which they have not financially been able to do in the past. Of prime importance among service provider selection criteria for vertical or utility model services is the provider's commitment to required customer service levels. Solutions in these models include core functions that will impact the buyer's client relationships and customer service levels.
model. Providers standardize how the services are delivered, and clients plug the almost module-like services into their performance development or training systems. A good example is e-learning. Over the next five years the business world expects to see outsourcing evolve almost to a telecommunications model, because of utility models and cloud-based services. Just as in telecommunication services, one simply chooses a set of features and functionalities; it is invisible and immaterial to the customer how the supplier provides the services. Most business process utilities will initially be more attractive to small and midsized businesses, and this will create competitive pressures for large enterprises, which will eventually start to turn to BPUs. The impact of BPUs on outsourcing
will have a dramatic effect on the provider landscape, as the model will allow big, monolithic companies to offer capabilities and services to small and midsized businesses, which they have not financially been able to do in the past. In a BPU, it does not matter to the provider's bottom line whether it delivers services to a company with 5 employees or 50,000 employees. This will result in the growth of BPU services, which in turn will drive more consolidation of the provider supply chain into single companies.
Considerations & Factors
The most important service provider selection criterion for vertical or utility model solutions is the provider's domain knowledge in the buyer's industry.
Verticalized and utility services include core functions that impact the buyer’s client relationships and customer service levels.
1. Does the provider have the expertise to support vertically-oriented functions? This is a serious consideration, because vertical functions and applications are business-critical and time-sensitive.
2. Does the provider have the skill sets and knowledge for processing financial transactions among providers and payers?
3. Does it have the requisite knowledge, processes and technologies to keep the buyer's data secure as it flows through various systems?
4. Does the vertical provider have expertise in the specific areas critical to the business, and can it manage them for compliance, risk and audit trails?
Another consideration in choosing a vertical or utility solution provider is the pricing structure. Does the unit basis for the provider's services give the buyer flexibility for changing needs? Also of prime importance is the provider's commitment to required customer service levels. After all, verticalized and utility services include core functions that impact the buyer's client relationships and customer service levels. Although a BPU is a much simpler
buy with a shorter decision process, a key consideration added to the buying decision is that buyers need to understand their options for keeping access to the platform if they decide to switch providers. Do they have any options for migrating? Buyers need to be sure of their exit strategy in case the utility platform ends up not being the right solution for them. Another important consideration in selecting a service provider in a utility-based model is that utility computing includes the concept of unified communications. A provider can have the most secure server or data center in the world, but if it cannot deliver information onto new Android phones while maintaining security and information integrity and meeting SLAs all the way out to that point, it really is not a utility service. Exceptions will change the cost-and-benefits picture. Buyers need to ensure they select a provider with a proven track record in integrating with emerging technologies.
Cloud
A Place in the Private Cloud
Key migration steps for your organization
As time goes on, with high-cost technology implementations compounded by rising capital costs and falling bottom lines, and keeping in mind the important element of security, organizations will invest more in private cloud services than in public cloud offerings, but will ultimately use the private cloud as a stepping stone to the public cloud, according to Gartner. Moving to private cloud services need not be as traumatic as moving house. Putting in the right foundations now eases the transition and helps when it comes to the shift to cloud services. Even so, the analyst firm
believes that many organizations will operate a hybrid model comprising a mix of private, public and non-cloud services for a number of years to come. This approach means that a primary responsibility for IT departments will be to source the most appropriate service provider. However, despite such predictions, it appears the wholesale adoption of private cloud is in its very early days and is still an
aspirational notion at best today. There are two reasons for this - the hype is at an all-time high right now, and there are very real commercial and architectural challenges involved. To add to the situation, many organizations have also delayed much of the necessary internal application and infrastructure renewal work as a result of the still-difficult economic climate. And while some providers claim to run private clouds after having introduced a rack of virtualized servers, most such data centers are working in traditional ways and are in reality just big automated server farms. The sectors showing most interest in the concept are those where workload volumes peak and trough and tend to be cyclical, such as retail, distribution, some global manufacturers and the public sector.

1. What is private cloud?
A starting point when defining private cloud is stating what it is not. And what it is not is simply the virtualization and automation of existing information technology. One definition of private cloud is that it comprises IT services that are delivered to consumers via a browser, using internet protocols and technology. Underlying platforms are shared, scalability is unlimited and users can obtain either more or less capacity depending on their requirements. Finally, pricing is based on consumption and usage, and is likely to include some form of chargeback mechanism.
Counter-intuitively, perhaps, private clouds do not all necessarily have to run in-house. According to the latest industry thinking, the term can also be applied to services offered by public cloud providers where infrastructure is not shared with third parties, as it is in the more traditional multi-tenant model. This approach means private capacity is allocated to customers and sensitive information is essentially ring-fenced. Organizations adopting this model for compute-intensive tasks, such as large-scale data modeling or cleansing, today include pharmaceutical companies, investment banks, government and the military. Some enterprises are also going down this private cloud infrastructure-as-a-service route for non-mission-critical jobs such as development and test activities. But all tend to bring the data and applications back in-house after such work is complete, whether they then run it on a private in-house cloud or not.

2. Steps towards private cloud
The first thing to do when moving down the private cloud route is to analyse your applications portfolio. While many
organizations know that up to 50% of their applications are used by no one, most simply would not dare switch them off. Moving to private cloud is like moving house - it gives you a chance to clear things out, and it is a big opportunity if you want efficiencies. The idea is that you only pay for what you use and you only use what you pay for, so there is no point paying for clutter. Once you have established what to keep and what to get rid of, brutal rationalisation is the order of the day. This activity will reduce the number of physical servers required if they are not already virtualised. And if they are not, they should be, because virtualisation takes organizations a good way down the path to their private cloud goal, and should help to drive efficiencies simply by boosting system resource utilisation rates. Rationalising applications, on the other hand, has the additional benefit of making it easier to see which offerings might lend themselves to either on- or off-premise private cloud provision, and which are commodity enough to farm out to the public cloud. The next step, meanwhile, entails standardising hardware and software stacks. While this move does not necessarily mean making only one architectural option available, it does imply limiting the choice of builds so that anything outside the norm is considered a special case that must be justified. The aim here is to make the infrastructure more performant and more cost-effective in procurement and management terms, but disciplined IT governance is required to maintain such standardisation and to resist pressure to introduce exceptions. Once all these activities are complete, it then becomes possible to start undertaking management automation, while ensuring that processes are orchestrated based on strong policies. But migrating to a more service-oriented approach using tools such as ITIL is also important, as is understanding user consumption and service economics, with the aim of introducing chargeback mechanisms.

While such a journey is likely to take at least two to three years from start to finish, most organizations have already done a fair bit of virtualisation where possible. Some have also settled on a number of standard builds, but the automation component is less mature. Least advanced of all, however, is the process and business change element. Moving to private cloud is essentially more of a commercial step than a technical one, and the critical question is: 'If you use less, do you pay for less?' If not, you have not got a private cloud.

3. Chargeback challenges
One of the most difficult areas when moving to a private cloud mode of operation is getting the pricing right. Most organizations today still see IT as a cost rather than a profit centre, which means business users are generally not used to the idea of describing what they require in terms of IT service consumption and related service level agreements (SLAs). They are also generally unaware of what or how much they consume in IT resources, or the potential cost of doing so. So to help them understand such costs, introducing at least a nominal notion of chargeback - even if it involves just showing them a bill of what such consumption costs in theory - is crucial. From an IT perspective, this process entails working out the cost structure of the IT services provided and establishing how much to charge based on both consumption and pre-defined SLAs, to introduce a pay-for-use model. Again, standardisation is a key word here, the idea being to make it financially unattractive to deviate from standardised services or SLAs by making clear what such a decision will cost. Cloud providers' billing is not yet very sophisticated for utility computing, but it is important because it stops IT acting as a black hole in the organisation. Everyone must pay a fair share, so it is best to create and maintain a set of differentiated, limited payment options based on user profiles. To work out this price, it is necessary to establish the cost of purchasing x amount of infrastructure over a five-year period, as well as the capital and operational costs of running the IT department, including employee salaries, maintenance and support. A big chunk is the central infrastructure that everyone pays for, but it then has to be broken down and apportioned as a fixed cost to certain applications, which helps to build a profile that can be reviewed each year. Users can refer to a spreadsheet and see whether costs go up or down and whether there is a need to change the pricing model.

But to get to the 'if you use less, you pay for less' model, the next step is to introduce metering tools, which are still relatively immature and could best be described as 'evolving'. There is also a need to think about the basis for charging, as what is equitable in one business will not be in another. If the organization is transaction-based, one option is to look at the number of users and system updates; for an information-based organization, it might be fairer to look at the number of hits on a database. Many organizations will have to blend the two, so the entire process needs to be thought through. If moving to a private cloud service provided by a third party, working out consumption, usage and the difference between cost and price is equally crucial. If the organization does not know what it consumed and what its usage was before the transition, it becomes impossible to know whether users are paying more or less. And in most cases, historically, services bought on a pay-as-you-go basis have cost more. This situation arises because, while an organization may agree, for example, to pay per user per month for a given service at a rather inexpensive rate, it generally still has in-house IT personnel and infrastructure to pay for. For service economics to work, the organization needs to know how much it should actually be spending on such services.
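The chargeback arithmetic described above - a fixed share of central infrastructure cost apportioned by user profile, plus a metered pay-for-use element - can be sketched in a few lines. All cost figures, profile weights and unit rates below are hypothetical.

```python
# Sketch: a minimal chargeback model - a fixed share of central
# infrastructure cost apportioned by user profile, plus a metered
# pay-for-use element. All figures below are hypothetical.

FIXED_ANNUAL_COST = 1_200_000.0  # central infrastructure, salaries, support

# Differentiated payment options based on user profiles (relative weights).
PROFILE_SHARE = {"light": 0.5, "standard": 1.0, "power": 2.0}

def annual_charge(profile, metered_units, unit_rate, total_weighted_users):
    """Fixed apportionment by profile weight, plus metered usage."""
    fixed = FIXED_ANNUAL_COST * PROFILE_SHARE[profile] / total_weighted_users
    return fixed + metered_units * unit_rate

# Example: a 'standard' user in an organization with 1,000 weighted users,
# consuming 2,400 metered units (say, thousands of database hits).
print(f"annual charge = {annual_charge('standard', 2_400, 0.05, 1_000):,.2f}")
```

Even while metering tools mature, a spreadsheet version of this calculation gives business users the 'if you use less, you pay for less' visibility the model demands.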
This also enables conversations with the business about how much it really is consuming.

4. Organisational and governance challenges
Another key challenge when moving to the private cloud is that many business units are simply not used to the idea of sharing infrastructure with their colleagues. This mindset means dialogue will probably be required to help them understand the corporate benefits of no longer having access to their own dedicated resources. But for the IT department itself, moving to a share-and-share-alike model also implies changes to how resources are managed. This change in turn means the IT function may need to be reorganised or rationalised, as staff will no longer serve particular business units but will instead manage common services. For example, IT specialists may be retrained to become members of more generalised teams covering broad areas such as storage and virtualisation, helping the organization leverage virtual teams. Personnel may also require skills development in areas such as commercial or vendor management to negotiate external private cloud arrangements or procure-to-pay models - for example, if cloud-bursting out to the public cloud to gain extra capacity. Another consideration, meanwhile, is ensuring existing processes are fit for purpose in the new world, so that they do not act as a roadblock to commissioning new services. Moreover, if signing up to a third-party private cloud provider, it may also be necessary to review and extend existing risk frameworks to ensure they cover both internal and external datacentres located in potentially different regions of the world.
Such activity will entail embedding audit rights into contracts and ensuring security rules are set based on corporate policy. It will also involve ensuring that risk assessment frameworks are updated every six months, rather than the current average of 18 months, to cope with potentially rapid change.

5. Lock-in challenges
- Achieve economies of scale - increase volume output or productivity with fewer people; your cost per unit, project or product plummets.
- Reduce spending on technology infrastructure - maintain easy access to your information with minimal upfront spending, and pay as you go (weekly, quarterly or yearly), based on demand.
- Globalize your workforce - people worldwide can access the cloud, provided they have an Internet connection.
- Streamline processes - get more work done in less time with fewer people.
- Reduce capital costs - there is no need to spend big money on hardware, software or licensing fees.
- Improve accessibility - you have access anytime, anywhere, making your life so much easier.
- Monitor projects more effectively - stay within budget and ahead of completion cycle times.
- Reduce personnel training - it takes fewer people to do more work on a cloud, with a minimal learning curve on hardware and software issues.
- Minimize licensing of new software - stretch and grow without the need to buy expensive software licenses or programs.
- Improve flexibility - you can change direction without serious "people" or "financial" issues at stake.
A key issue when thinking of employing the services of an external private cloud provider is the difficult matter of lock-in. While many cloud vendors now provide migration tools to help customers convert their applications and data to run in their environment, not all are based on open standards. As an organization exploring the unknown, care must be taken to ensure a safe exit if required, so don't end up locked in from the start. Some migration tools are great, but some lead to early lock-in, which will cause serious problems if you need to exit and migrate to another solution. The same applies to developing new applications in a private cloud environment. In such a situation there is a need to abstract the management layer from the vendor's application development tools and frameworks, because if the organization develops in those frameworks, it may find that the resulting application needs that framework to run - and this will be the start of a lock-in. This situation means the CIO or enterprise architecture group will need to make strategic decisions in areas where they may not have done so traditionally, such as which application development toolkits and frameworks to use. They will also need to define data standards and suitable means of managing master data to ensure that switching between clouds can become a reality rather than simply a promise. But, while there may be a lot to think about, putting the right foundations in place now will make life much easier in future when trying to source the right type of service for the job, whether that is an internal or external private cloud or a public cloud option.
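The advice above - abstract applications away from any one vendor's tools - can be illustrated with a thin, provider-agnostic interface. The sketch below is a simplified Python illustration; the interface, the in-memory stand-in backend and the function names are all hypothetical, and a real adapter would wrap a specific vendor's API.

```python
# Sketch: abstracting storage behind a provider-agnostic interface so the
# application never depends on one vendor's SDK. Names are hypothetical.
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """What the application codes against - never a vendor SDK directly."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend; a real adapter would wrap a vendor's API calls."""

    def __init__(self):
        self._blobs = {}

    def put(self, key, data):
        self._blobs[key] = data

    def get(self, key):
        return self._blobs[key]

def archive_report(store: ObjectStore, name: str, body: bytes) -> None:
    # Application logic sees only the interface; switching clouds means
    # writing a new adapter, not rewriting this code.
    store.put(f"reports/{name}", body)

store = InMemoryStore()
archive_report(store, "q1.pdf", b"...")
```

The exit strategy then reduces to writing one new adapter per provider, rather than rewriting every application that touches storage.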
To sum it up: some elements of IT services, such as data workflow and business logic, will always remain inside companies because they are unique differentiators, but most common services will move to the cloud as organizations seek the lowest-cost execution model, unlimited capability and the flexibility to change.
Big data
The businesses of banking and insurance, like all businesses, look to maximize every rupee spent or invested. Firms are always looking for opportunities to increase efficiency and effectiveness. In spite of, or because of, the recent economic downturn, many firms continue to invest in new technology solutions to further reduce risks and control costs.
A recent survey by a business publishing company, along with some user organizations, found that a majority of firms surveyed expected to increase their IT investments. For these firms, a “migration to quality” is seen as a means of mitigating the impact of market volatility. Portfolio management systems, whether in the form of an onsite installation or a fully outsourced hosted solution, continue to evolve towards greater efficiency. At most firms, the portfolio management system is the core of the investment operation. Migrating to a new one, particularly to a new provider, is a significant undertaking, but the potential efficiency gains and cost savings should make it worth the effort. Still, firms are often reluctant to change systems, despite the opportunity for process improvement, when they confront the issue of converting data from their current system to a new one. The process of data conversion is long, daunting, and costly. Firms are concerned about disruption in client service, and regulatory requirements are unclear. Many firms mistakenly assume that the only sure way to remain in compliance is to bring all their data, including historical client data, from the old system to the new one. The concern is often sufficient to cause a firm to cancel or defer its plans to change systems, which means missing out on the opportunity for improvement and ceding an important competitive advantage.
A Targeted Approach
Fortunately, and surprisingly to many firms, the SEC does not require firms to convert their full transaction history when changing systems. Specific details of every buy and sell, or of the movement of cash and securities into and out of accounts, can be maintained offline and can, in fact, be discarded after five years. So, despite what many firms mistakenly believe, an “everything but the kitchen sink” approach to data conversion is not required, or even advisable in many cases, when migrating to a new portfolio accounting system. Instead of doing a blanket data conversion, firms should work with their technology providers to determine their reporting and continuity requirements, and perform a targeted conversion of only the data required to support their actual business needs. This approach keeps the scope and costs of conversion manageable and results in a smoother transition. This document is designed to help understand data conversion options and will outline:
o What data is critical and must be converted for compliance purposes as well as business continuity.
o What data is not critical, and options for dealing with the relationship between data conversion and reporting.
o What to look for in a new technology provider relative to data conversion.
The inherent challenges of data conversion in a system migration are not insurmountable. They certainly should not deter a firm from exploring new technology solutions when it becomes apparent that the current system is not sustainable for the long term. Understanding data conversion will enable you to ask more specific questions and make a better-informed decision on a new accounting platform and technology provider.
Identifying Critical Reporting Data
Converting historical client data is not impossible, but firms must ask themselves how much is truly necessary. The more components of data you try to bring over from the old system to the new one, the more prolonged the conversion process and the higher the costs. And, because
different portfolio management systems use different databases and offer different reports, a firm might go through a painful and expensive conversion and still not get the reporting capabilities it was expecting. The
key is understanding the relationship between data conversion and reporting. A portfolio management system essentially has two components: a database and a report engine that queries the data in the database in order to generate reports. Only certain reports require historical data. The question to ask, therefore, is: what reports are essential for your specific business needs, and which of those reports will require historical account data? Judging from system implementation and data conversion experience, most firms require two basic report types that necessitate a data conversion in order to run reports that draw on an account history: time-weighted rates of return, and current open tax lot information with original cost and purchase date. Cost basis information enables the firm to run appraisal reports as well. In most cases, these reports are sufficient from a regulatory and business continuity standpoint. Performance history data enables a firm to continue to provide account returns as a service to clients. It also helps carry those records from one system to another, if necessary for
compliance purposes. Ask your prospective technology providers about their experience in converting the data needed to run these types of reports. A qualified implementation team should have the expertise, tools, and a proven process for extracting the relevant data without bringing over extraneous account information. Firms may have reporting needs above and beyond those cited that require historical data. Discuss your particular reporting needs with your prospective technology provider. They should be able to tell you which reports require historical data, and which data those reports require. The point is to start from the perspective of your business needs and work back to identify your data requirements, instead of assuming you need to bring all your old data over “just to be safe.” And remember, just because data is not converted does not mean it is unavailable. It is simply housed in a different system and can be retrieved if ever necessary.
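The two report types above can be sketched in a few lines of Python. The lot records and the quarterly returns here are invented for illustration; the point is that only open lots and per-period return figures need to carry over, not the full trade-by-trade history.

```python
from datetime import date

# Hypothetical legacy records: (purchase_date, original_cost, shares_remaining)
lots = [
    (date(2010, 3, 1), 50_000.0, 100),  # still open  -> convert
    (date(2011, 6, 9), 20_000.0, 0),    # fully sold  -> leave offline
    (date(2012, 1, 4), 75_000.0, 250),  # still open  -> convert
]

# Targeted conversion: only open tax lots move to the new system.
open_lots = [lot for lot in lots if lot[2] > 0]


def time_weighted_return(period_returns):
    """Chain-link sub-period returns into a cumulative TWR.

    Only the per-period return figures are needed, not every
    underlying buy/sell transaction.
    """
    growth = 1.0
    for r in period_returns:
        growth *= 1.0 + r
    return growth - 1.0


# Quarterly returns of 2%, -1% and 3% chain to roughly 4.0%.
twr = time_weighted_return([0.02, -0.01, 0.03])
```

Converting `open_lots` plus the period-return series preserves both appraisal reporting and continuity of client performance history, which is the targeted-conversion argument in miniature.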
Transaction History
What is the SEC looking for? The SEC is accustomed to firms changing systems and, as noted earlier, does not require or expect every historical trade detail to be migrated to the new system. To the extent that examiners are interested in transaction history, they will rely more on third-party documentation than on a firm’s own records. Investment firms are required to maintain custodial records, which contain all the historical transaction detail required. A key area of SEC focus is performance measurement, especially when firms use their performance history in marketing and promoting their services. For as long as a firm has shown performance, it must maintain customer statements as well as the market values on which calculations are based. Although compliance with the Global
Investment Performance Standards (GIPS) is not an SEC requirement, it is increasingly regarded as a de facto industry best practice. If a firm claims GIPS compliance, it must also adhere to the GIPS guidance statement on record keeping. The guidance requires firms to support both portfolio and composite performance calculations for all periods of performance the firm is showing. This necessitates not only individual account time-weighted returns, but also the market values that were used for weighting account returns in the composite.
Historical Data Management
None of this, however, requires the conversion of detailed transaction history to the new system. What, then, should you do with it? SEC books-and-records retention requirements call for data to be maintained, for five years in most cases, on: “(i) micrographic media, including microfilm, microfiche, or any similar medium; or (ii) electronic storage media, including any digital storage medium or system that meets the terms of this section.” In the latter case, there are a number of storage alternatives available, including onsite or offsite server-based data storage systems or online, hosted storage services. Your solution provider should be able to advise you on the data storage and retrieval option best suited to your business. Capturing and storing massive amounts of data from an old system in a retrievable format is doable. The pros and cons depend as much on the firm’s culture as on regulatory requirements; there are some packrats among us. That said, trying to populate a new system with massive amounts of historical data is counterproductive. Get what you need and move forward.
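A retention window like the five-year horizon above is easy to track alongside an archive. The sketch below is illustrative only, not compliance advice: real retention rules key off record type and fiscal-year boundaries, and the function name and constant are invented for this example.

```python
from datetime import date

RETENTION_YEARS = 5  # the books-and-records horizon cited above


def purge_eligible(record_date: date, today: date) -> bool:
    """True once an archived record has aged past the retention window.

    A rough sketch: simply shifts the record date forward by the
    retention period and compares it with today's date.
    """
    cutoff = record_date.replace(year=record_date.year + RETENTION_YEARS)
    return today >= cutoff


# A trade archived in March 2018 becomes purge-eligible in March 2023.
ok = purge_eligible(date(2018, 3, 31), date(2023, 4, 1))
```

Tagging each archived record with such a date at conversion time makes the offline store self-pruning instead of an ever-growing "kitchen sink".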
Evaluating Expertise: What to look for?
As you begin to review new options for portfolio accounting, remember that you are not only evaluating systems, but also the companies behind them.
Can the provider migrate your firm effectively to the new solution, including the conversion of the essential data you need for business and service continuity as well as for compliance? A prospective provider should take the time to discuss your business needs and make a recommendation as to which data should be converted to support your requirements. That means asking: What are your day-to-day business requirements? What reports do you need for clients and for internal management? The people overseeing the migration should have significant experience in converting data for investment firms. They should know the SEC and GIPS requirements and recommendations, and help you distinguish between “nice to have” and “need to have.” They should have a proven methodology for converting essential data, and a dedicated, experienced conversion team.
Moving Forward
Transitioning to a new portfolio accounting system is not simply about having the latest technology; it is an opportunity to review and improve workflow processes, streamline operations, align with industry best practices, and strengthen efforts to retain and attract customers. The challenges
of data conversion should not hold you back from realizing these advantages. It is neither
practical nor necessary, from either a business or regulatory perspective, to convert all historical client data to a new system. There are several options for dealing with legacy data based on your anticipated needs. With expert guidance from an experienced solution provider, you’ll be able to identify and limit the conversion to the specific data you need to meet your business requirements. That allows for a faster and less costly transition with minimal disruption, and enables you to start reaping the advantages of a new solution right from the start.
Customer Relationships
through your customer database
Many businesses neglect their most valuable asset: their own in-house database. Why? Most think it is too time-consuming to keep updated, or simply fail to realise the potential goldmine from using and maintaining their database. Most businesses strongly believe the marketing phrase “Content is King”; however, you may not have heard that “Database is Queen”! Your database must be viewed as a valuable asset. It is a potent source of marketing intelligence and a powerful tool for building relevance while nurturing relationships. To make the most of your database you must be able to answer these five questions:
1. What is your database objective?
2. What information is available?
3. How can the information be captured?
4. How can you use the information?
5. What is the best way to segment your database?
Let us examine these five points. What is your database objective? The majority of businesses, when asked this question, will respond that the database must be used to generate sales revenue for the business. This is absolutely correct! In most ‘Business
to Consumer’ (B2C) situations the database can be a very significant sales revenue generator for the business. There are other valid objectives; these could include:
o Customer loyalty building
o Nurturing prospects into customers
o Engaging and building an active referral network
o Data mining for market research, intelligence gathering and analysis
o and many others...
What information is available? Take an inventory of what information is available in your organization for both prospects and customers. Looking back at your database objectives, determine what additional information will be needed. Consider how you can consolidate all the information from various sources into one online centralized database for your organization. Keep in mind that, depending on how long ago the information was recorded, it may no longer be up to date. Various statistics indicate that database information may churn at a rate of 20% to 35% per annum. How can information be captured? This is where an organization must be smart and committed to implementing
an organization-wide process for collecting and updating database information on an on-going basis. Database information must be captured at all contact points with your prospects and customers alike. These can occur online, through web contacts and emails, or through traditional face-to-face contact, phone conversations, networking and postal communications. The use of an online centralized Customer Relationship Management (CRM) system is highly recommended to ensure continuous availability for all users, so that updating will be encouraged and enforced. Most comprehensive CRM systems can capture the various communications received from and sent to contacts in the database. How can you use the information? The database can and must be used to identify market opportunities for your business. To do so you must segment your database to identify a target audience that you can develop as a potential market for your business. An intimate understanding of this target segment of your database will allow you to create a highly focused marketing message that is relevant and resonates with your selected prospective customers. Consider the following five points as a guide in defining your database segments:
o Right size to be worth the effort
o Actionable information
o Make the segment specific
o Unique and identifiable
o Discernible link to the focus of your planned message
What is the best way to segment your database? You can be as creative as you want, with lateral thinking in the process. Be guided by how you can achieve better results with a more competitive and compelling offer that your business can deliver for the target segment.
In general, there are four commonly used methods of segmenting your database:
o Financial: revenue generated from the customer’s transactions
o Demographic: describing customers in terms of their personal characteristics
o Geographic: describing customers in terms of their physical location
o Psychographic or behavioural: describing customers in terms of their preferred activities and actions
Treat your database as your friend; love and nurture your database and it will repay you many times over! Every business, no matter the size of its database, must get a better return from the database and engage marketing programs to monetize the results of these efforts.
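Two of the four methods above can be combined in a few lines. The customer records and the 50,000 revenue threshold below are invented for illustration; the sketch simply labels each customer with a financial tier and a geographic cut, then groups them.

```python
# Hypothetical customer records: (name, annual_revenue, city, channel)
customers = [
    ("Asha", 120_000, "Mumbai", "mobile"),
    ("Ravi", 15_000, "Pune", "branch"),
    ("Meena", 95_000, "Mumbai", "web"),
]


def segment(customer):
    """Combine a financial and a geographic cut into one segment label."""
    _, revenue, city, _ = customer
    tier = "high-value" if revenue >= 50_000 else "standard"
    return f"{tier}/{city}"


# Group customer names by segment label.
segments = {}
for c in customers:
    segments.setdefault(segment(c), []).append(c[0])
```

Each resulting group (for example, high-value customers in Mumbai) is sized, specific and identifiable, so it can anchor a focused marketing message as the guide points suggest.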
Technology Core Solutions
Banking is evolving as new laws and market requirements force banks to modernize their core systems. Regulators and investors are demanding improved data and transparency. Customers are looking for integrated services across a growing range of channels. And competitive pressure is driving a constant stream of new products and services. Banks have invested, and continue to invest, a lot of time and money in technology platforms, and those systems have served them well. But now, after decades of quick fixes and workarounds, those same systems are holding them back.
Time for a change
Most of today’s core banking systems were built in the 1970s and 1980s and, after countless modifications and add-ons, have become so complex and convoluted that it may be difficult to fully understand them. This can make it hard for banks to comply with regulations and determine adequate controls. It can also make the systems difficult and expensive to support and improve. Many banks spend half of their IT budget simply maintaining their core systems. And whenever they need to modify their systems to handle a new product, channel or market, they run into a brick wall. To make matters worse, it is increasingly difficult to find IT staff with the right skills to support the legacy technology. The good news is that viable and demonstrated solutions to these challenges have started to emerge. Many modern core banking platforms are flexible and scalable, designed to adapt to a bank’s changing needs. Many of these platforms also provide real-time capabilities to improve the customer experience and help banks manage risk more effectively. Replacing a core system can seem overwhelming. Here are some ways to
overcome inertia and manage risk.
Redesign processes. Be prepared to replace the processes and workflows associated with your core banking systems. Most solutions come with highly recommended, state-of-the-art processes.
Choose vanilla. Differentiate your business by developing new products and services, not by modifying the application code. Customized code can make the system hard to maintain and upgrade, which can be disastrous in an environment where regulations and market requirements are constantly changing. If customization is unavoidable, design it as a bolt-on that doesn’t interfere with the standard platform.
Manage change. A major systems replacement is not just a technology change but a fundamental transformation of the business. Actively preparing for the shift and addressing organizational resistance is essential.
Commit to the journey. Core systems replacement should be handled as a series of coordinated improvements rather than a big bang. Reaching the destination can require a sustained commitment over many years. Finally, the organization has to be ready and raring for this makeover. It is a tough call to make.
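The "bolt-on" idea can be sketched as a hook mechanism: the vendor's standard code exposes an extension point, and the bank's customization observes results without modifying the core. Everything here (`post_transaction`, the hook list, the audit log) is a hypothetical illustration of the pattern, not any real platform's API.

```python
# "Vendor" platform code: the bank never modifies this function.
def post_transaction(txn: dict, hooks=()) -> dict:
    ledger_entry = {"account": txn["account"], "amount": txn["amount"]}
    # Bolt-on point: customizations observe the result without touching
    # the core posting logic, so platform upgrades stay safe.
    for hook in hooks:
        hook(ledger_entry)
    return ledger_entry


# Bank-specific bolt-on, kept outside the vendor code base.
audit_log = []


def local_audit_hook(entry: dict) -> None:
    audit_log.append(dict(entry))


entry = post_transaction({"account": "SB-001", "amount": 500},
                         hooks=[local_audit_hook])
```

Because the customization lives entirely in the hook, a vendor upgrade replaces `post_transaction` without breaking the bank's code, which is the whole argument for staying vanilla.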
Core Solutions
RE-INVENTING
The case for transformation is evident: as time goes on, technology lags. Leading-edge processes mapped 10 years ago have over time been overtaken by more relevant issues, shaped by improving technologies and experiences. Marrying multiple pieces of software makes systems cumbersome. Some organizations have tried to hot-wire aging systems to improve performance, but this is not a long-lasting solution, and it makes little business sense to keep adding without knowing where it will take you. Yes, evolution is required, but what if this means you continue to carry legacy systems that, instead of pushing you forward, tend to hinder progress? Replacing these systems may well be the best way to reduce complexity and support business growth.
Another recent development is that technology decisions have a new set of masters. The CIO has moved from decider to influencer, and a new set of functional heads now drives business considerations and decisions.
But the processes and tools for CBS replacements have improved considerably, and research shows that banks that have rebuilt the CBS in part or in full have achieved measurable performance improvements over peers.
Struggling with dated technology
Core banking refers to a bank’s basic functions, such as gathering deposits, making loans, and managing corporate cash. The systems that support this core emerged with the introduction of mainframe-based transaction processing. These systems allowed banks to coordinate operations centrally, creating a dependable, if rigid, platform designed to handle large volumes of transactions efficiently and with minimal downtime. The systems served banks well until the IT environment changed markedly, and web communications, network computing, and plug-and-play system design emerged as keystones of high-performing platforms. Older core systems were ill-equipped to support the range of functions, modularity, and scalability needed by today’s financial institutions. During recent years, mergers and changing operating environments have brought complexity to breaking point. The Internet has increased demands to deliver banking services over new channels, such as mobile phones. One private bank found that its old and unwieldy CBS was severely hurting its ability to control costs. Manual workarounds and a burgeoning volume of custom applications ran up a huge bill. Elsewhere, a rapidly growing bank dedicated to serving emerging markets fell behind its rivals in delivering online banking services. Its CBS platform, a mix of incompatible packages and in-house applications, made it hard to aggregate account data across the business.
Hotwiring and rebuilding systems
Over the past decade, many attempts to replace systems wholesale went awry. Top guns, with plenty of other to-dos on their lists, often delegated the entire project, from planning to implementation, to vendors, removing themselves from the governance process. Vendors, through no fault of their own, lacked cultural context and often created architectures that didn’t fully mesh with business priorities. Cost, quality, and implementation issues meant that less than 30 percent of the first generation of replacements succeeded. Inflexible systems, equipped to handle only a narrow set of functions, needed costly custom fixes to update applications. Building such systems in-house has proved a difficult and resource-draining task, calling into question the business value of the overall system transformation. Healthy margins before the recession cushioned many banks from the need to act. Many financial services heads argued that rapidly changing technologies made it hard to know which standards, applications, and packages would endure long enough to pay off the investment. The financial crisis, however, tightened margins for most banks and forced a relook at existing systems and processes, pushing banks to think of ways to improve potential and performance.
Changing ecosystem
With the passing years, management’s understanding of technology has matured, as has the technology itself, bringing improvements in planning, project management, and platform design. Global players have emerged out of a fragmented vendor market, bringing better technology capabilities and superior skills in coordinating large and complex projects. Next-generation software platforms, and a better ability to deploy and integrate emerging technologies, mark a clear improvement.
Most transformations can be completed in less than five years, compared with a decade or longer when customization, resource constraints, and unforeseen business requirements bogged down
projects with time and cost overruns. Banks are investing more
up-front time, on average one-third of the total project timeline, in the planning and evaluation process, which helps reduce errors, build organizational consensus, and speed implementation. Recently completed project studies show that enterprise-wide systems replacements can help banks achieve higher asset and pre-tax profit growth, and thereby better cost control. Of those institutions that made the transition, global tier-one and tier-two players operating in mature markets saw their pre-tax profit growth rate accelerate by up to 30 percent, and their ROI ratios improved significantly. With the economics improving, the number of packaged solutions has grown by about 20 percent annually since 2004, a volume that is expected to rise sharply over the next ten years as more systems reach the end of their lives.
Pile driving change
The most effective enterprise implementations share similar characteristics: top-down planning, IT architecture development anchored in business needs, and a new partnership role with vendors. Companies can draw on a number of best practices. Take a domain-based approach to architecture development. Banks have legions of disparate processes that focus on customers, lines of business, and day-to-day operations. These are often further segmented by region or business unit. Under the hood, however, many rely on similar software and deliver similar capabilities, such as managing customer data, handling transaction flows and ledger items, and preparing statements. Next-generation CBS platforms take a modular (or domain-based) approach to architecture development. Because domains classify IT processes and their enabling applications by what they deliver rather than by business or process owner, they can help streamline the IT environment and standardize many common requirements. A bank might have 100 customer interaction processes that vary by product type, region, or income. Each might run separate software applications, even though the basic functional requirements may not vary substantially across the bank. Rather than housing multiple instances of the programming code required to run these processes, good CBS architectures bundle the common capabilities into sharable domains accessible by the businesses that need them.
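The shared-domain idea reduces to a simple structural pattern: one implementation of a common capability, called by every line of business that needs it. The class and function names below are hypothetical illustrations of that pattern, not any CBS vendor's design.

```python
class CustomerDataDomain:
    """One shared implementation of customer data lookups.

    Lines of business call into the domain instead of each keeping
    their own copy of the same capability.
    """

    def __init__(self) -> None:
        self._records = {}

    def register(self, customer_id: str, profile: dict) -> None:
        self._records[customer_id] = profile

    def profile(self, customer_id: str) -> dict:
        return self._records[customer_id]


shared = CustomerDataDomain()
shared.register("C42", {"name": "Asha", "kyc": True})


# Two lines of business reuse the single domain rather than
# maintaining separate customer-data applications.
def retail_dashboard(domain: CustomerDataDomain, cid: str) -> str:
    return f"Retail view for {domain.profile(cid)['name']}"


def cards_dashboard(domain: CustomerDataDomain, cid: str) -> str:
    return f"Cards view for {domain.profile(cid)['name']}"
```

A fix or enhancement to the domain (say, a KYC field) then reaches every consumer at once, instead of being re-implemented in 100 process-specific applications.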
Business driving the IT strategy
Driven by business requirements and presented in basic business terminology, a domain-based approach helps cut through complex technical specifications and thus invites greater engagement between IT and business professionals. It also makes it easier for managers to examine the mix of domains in their portfolios. That in turn helps managers distinguish between activities, such as settlement processing, that are generic to most lines of business and might be suitable for standardized packages, on the one hand, and specialized, high-value activities, such as Islamic banking, where the market opportunity might justify customized design, on the other. The result is a simpler, more cost-effective and responsive architecture framework that corresponds to the needs of the business.
A leading bank struggled with a tangle of applications that hampered its retail-banking operations. The organizational structure was decentralized around its city and regional branch network; the IT unit in each major metropolitan area supported several local branches with core IT and back-office services. Because each center functioned more or less independently, there was little cohesion among the bank’s many locations.
Long-term vendor partnerships
Replacements touch so many aspects of the enterprise architecture that selecting the right outside vendor can make or break a project. In the past, IT project managers, who often had little direct consultation with business leaders, commonly led the selection process. With the growing awareness about fostering long-term business
transformations, senior business and technology leaders are now more likely to be closely involved in the planning and selection process. One multinational bank, for instance, devoted two years (one-third of the total project timeline) to planning the engagement with the vendor. During this period, the two partners carefully defined the business and IT requirements, established key performance indicators (KPIs) and performance milestones, and piloted a couple of small programs to test and refine the new architecture framework. Rather than viewing the vendor as a plumber engaged to hook up the pipes in the IT environment, the bank turned to a provider it could trust to serve as a full partner and adviser with specialized experience and a track record in managing large-scale projects.
Putting CBS to work
Two different implementations illustrate how these principles can be put into practice. One involves a large, established bank where cost and complexity issues needed urgent attention, the other a developing-market bank where growth and speed to market were critical for continued success.
Slashing complexity
This arrangement gave the bank flexibility in localizing its service. Nonetheless, the lack of unifying application standards created logistical snarls in satisfying bank-wide business requirements, such as speeding time to market for a new corporate cash-management product or rolling out a new Internet security update. These problems in turn led to a logjam of custom fixes, which subverted the company’s cost structure. Despite a cost reduction drive, IT spending remained substantially higher than at the company’s peers. Its rivals were not only spending less overall but
also more successful at directing funds toward new, growth-based initiatives. The bank’s merger history added other constraints to the underlying architecture. Siloed data made it nearly impossible to create a single customer dashboard. While competitors went to market with integrated product suites, the weak linkage between applications hindered similar product-bundling opportunities. And legacy limitations and aging software architecture made it hard to winnow out older projects as new initiatives came on board. As a result, the technology environment expanded. It would hence be necessary to eradicate redundant programs, shrink the number of applications, and reduce the number of developers, servers, and storage devices needed to support the legacy architecture. It also required scaling back the IT portfolio, eliminating non-priority initiatives, and ensuring that the remaining programs
better served the strategy. Having decided on a domain-based framework, the planning team brought all stakeholders together to assess their business needs and to determine which capabilities could and could not be shared. The planning team thus created a new service-driven architecture. In some cases, pockets of financial services activity required highly specialized product features or applications tailored to the needs of nontraditional clients. In these instances, the bank worked with its vendor to custom-build certain elements of the CBS system. For the vast majority of the bank’s needs, however, it could leverage the shared-domain approach to standardize application development, thus shaving costs and time. Standardizing around a core set of banking applications allowed the bank to eliminate the redundancy that riddled its prior IT environment and to bring its cost structure back into line. What
in the past would have taken a bank of this size nearly a decade to complete was now expected to take five years. In fact, just two
years into the implementation cycle, the bank has already closed 50 percent of the cost gap with its competitors.
Business Intelligence
Exploring the relevance of implementing analytics for shaping a bank’s business, the various critical factors for its success, and the savings in cost and time.
It’s a given that financial services organizations need to be accurate with their reporting and be compliant with all regulatory requirements. Organizations need to be sensitive to changing market dimensions and customer requirements. They have to be sure about the products they sell to customer segments, where they sell (delivery channels) and whether it will be profitable or not. With 20% of customers accounting for 80% of profits, banks are still working their way out to focus on cost management, diversification into multiple fee-based income streams and optimizing capital utilization. They are finding new ways to steer and overcome challenges through a relationship-driven model to engage and endure customers. Customers do not want pre-configured products; they want products that are differentiated and meet their specific needs. The key lies in a bank’s ability to segment and profile its customers to gain a better understanding of their potential worth. What is required is for banks to achieve a 360-degree view of their customers from the existing silos of Lines of Business (LOB), which would include credit cards, insurance, investment banking, trading, retail banking etc. Going beyond would be analysis of specific areas such as channel usage and optimization, treasury, trade finance, derivatives etc.
In a nutshell, the RBI’s Committee on Technology Upgradation had advised Indian banks to put an EDW strategy in place, and while banks with a large number of computerized branches have started their pilot projects, progress has been slow. Of late, banks have been investing in Enterprise Data Warehousing (EDW) that can provide logical linkages to various sets of information available across a bank, for better analysis and for obtaining a holistic picture of their business.
Banking Analytics (business intelligence derived from the EDW data available from all LOBs) can provide metrics to quantify customer value and deliver capabilities that provide a competitive edge. Such analysis will be critical in driving performance as well as in managing cost and revenue in the banking industry. Financial services providers will be able to track various metrics across LOBs, including NPA movement, recovery analysis, deposit renewal analysis, deposit overdue analysis, deposit pre-closure analysis, customer acquisition and attrition analysis, bills growth analysis, bills exposure analysis, cash flow, profitability analysis and much more.
Core groups of subject matter experts (SMEs) have been constituted to ensure the accuracy and integrity of data flowing from a bank’s core applications to the RBI. Such initiatives make it clear that the banking system is going through a rather vigorous exercise of enhancing data quality, with stress on information management for both risk and productivity. The challenge is in real-time correlation of the large volumes of data to derive meaningful information or patterns.
Business departments can employ their time to analyze their performance in real-time rather than wait for days for the same. A few factors are critical to success:
Adoption: The success of a self-service analytics and reporting technology lies in its adoption. The implementation of banking analytics should be considered a strategic business initiative, not an IT initiative to decentralize reporting. This requires executive sponsorship and participation from all of the departments in a bank.
Plan and rollout: It will be critical for banks to plan the rollout and choose the best path for adoption. A phased, department-wise rollout is a good idea. Milestones need to be set, and the involvement of all stakeholders, who need to be aware of project risks and outcomes, is required. Project deficits with respect to functionality or access will impact adoption.
Barriers: The ability of bankers to adapt to a new environment and acquire skills to take proactive decisions assumes great importance here. Traditionally, especially in the public sector, there has been dependence on MIS departments for data and reports. With the advent of business analytics, this dependency has to come to an end. Moreover, there will be challenges with respect to age (the average age of a PSU banker is over 45; Source: RBI), which may act as a barrier and prevent staff from getting hands-on with the solution.
Knowledge transfer: Banking analytics is a cross-organizational initiative that will impact every department, and hence all business managers and senior executives must be trained and encouraged to use this tool. Even if we were to assume that the number of employees to be trained will be around 1,200 (around 10% of the average employee count across all banks; Source: RBI 2008-09), the magnitude of the activity becomes clear.
Financial organizations have the choice to build their own BI capabilities or to buy off-the-shelf solutions that are specific to their LOBs’ needs, say for loans, deposits, profitability or customers. Once BI is available bank-wide, users should be able to access and analyze reports based upon their roles and priorities. This eliminates a user’s dependence on MIS or the IT department for the effort to churn out varied reports. While CXOs get an executive view of dashboards specific to their interests, business users (based on their access privileges) can download reports in the format of their choice (Excel, Word, PDF etc.). They get access to a multitude of analytical reports from the integrated data repository. The reports can be through preconfigured
Perhaps, the bank may cleansing process, banks have realized graphs or charts that you can drill down want to use e-learning as an effective this and are working towards completing into for multiple levels. Coupled with the tool. this humongous task. A clean, accurate power of collaborative tools (such as historical data repository in the form of portals and messaging), bankers would Raghuraman is a Senior Technology Business Professional a data warehouse would help banks to be able to meet and resolve issues via with a primary focus on enterprise solutions for the not only expand their business, but to Financial Services sector. secure, online discussions, thereby He can be reached at raghuraaman@gmail.com prevent/mitigate losses as well. Although
I
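The 20/80 observation above is easy to check once profit can be attributed per customer in a warehouse. A minimal sketch, assuming a hypothetical `profit_by_customer` mapping; the function name and the figures are illustrative, not from any specific banking schema.

```python
def top_contributors(profit_by_customer, share=0.8):
    """Return the smallest set of customers that together
    account for `share` of total profit."""
    total = sum(profit_by_customer.values())
    running, top = 0.0, []
    # Walk customers from most to least profitable.
    for cust, profit in sorted(profit_by_customer.items(),
                               key=lambda kv: kv[1], reverse=True):
        if running >= share * total:
            break
        top.append(cust)
        running += profit
    return top

# Illustrative figures: two customers supply 800 of 1000 in profit.
profits = {"C1": 500, "C2": 300, "C3": 100, "C4": 60, "C5": 40}
core = top_contributors(profits)  # -> ["C1", "C2"]
```

The same ranking idea underlies attrition and profitability dashboards: sort by the metric, then slice the head of the distribution.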
worldwideweb
Working with Spiderman

Most financial services organizations already have a holistic web plan in place, but to create a new property with specific features, it is critical to know how to go forward. Unless you are an experienced web developer, you will need the services of a web development firm for your web business. Here are a few key pointers for evaluating one. For simplicity, we use the terms 'web developer' and 'web development firm' to cover a range of vendors, from a small firm of a designer and a developer to a full-service digital agency.
Short-listing
• Do a bit of window shopping before you finalize the developer.
• Ask for references from people who have had good results.
• Speak to referees and past customers.
• Assess past work.
• Ensure that the development team provides you technical as well as business services.
• Ask questions to assess their technical and business skill sets.

Most importantly, check:
• Local firms, or the services offered on web-based marketplaces (always check recommendations).
• The websites of the short-listed developers, to assess the professionalism of their own site. You will be surprised how many vendors have a sloppy site with bad language or spelling mistakes, or one that appears dated by years.
• Websites they have developed. Look for programming bugs, spelling mistakes and other errors that would indicate work-quality gaps.
• The experience level and track record of the key persons who will work on your site development.
• Timely project completion.
• Resource availability: software libraries and components that may speed up development.

If they are a full-service provider, they should ideally assist you with the full range of technical and creative services, including:
• developing your business web strategy
• suggesting and booking domain names
• deciding technology platforms
• finding a hosting solution
• implementing graphic design, or providing good recommendations
• writing text content
• programming and testing
• optimizing the site for search engines and marketing it on the web.

One person or a small team need not have all the skills. They should
have access to a team or network of specialists to help provide you a complete solution. When choosing a full-service agency, ask them which services they will provide directly and which will be contracted out. Get a separate quotation for each service. Most firms specialize and excel in one or a few aspects - design, development, marketing and so on. Try to gauge their areas of excellence and encourage the firm to outsource the other areas to specialists. After the initial short-listing, it is time to invite proposals. Ensure that proposals outline all major heads of cost and that all potential expenses are known to you right at the beginning.

Clear cost basis
If the proposal is on a Time & Material (T&M) basis, it means that you will pay for labor per hour. Ensure in advance that the project duration and a clear end date are given, so that you don't pay for a project that stretches endlessly. Is there a system to log hours spent on your project? Will the log be periodically shared with you? Is the rate per hour reasonable, and is it drawn up for all the roles that would be involved in the project? Is it inclusive of duties and levies such as service tax? If the proposal is on a project basis, it means that you will pay for the end product irrespective of how much effort goes into it. Ensure that all end-deliverables and specifications are clearly mentioned, so that both parties know the stage at which the project is over. In both cases, ensure that an estimate of time is given and check whether there is any penalty for delays on the part of the vendor. Check that the quote covers not only labor but also expenses such as domain name registration, hosting and any software tools that may be required for development. You must insist on having a total cost estimate.
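The arithmetic behind a T&M quote is simple enough to sketch. The roles, hourly rates and tax percentage below are illustrative assumptions, not figures from the article.

```python
def tm_estimate(hours_by_role, rate_by_role, tax_pct=0.0):
    """Time & Material: pay per logged hour, per role, plus
    duties and levies such as service tax (tax_pct percent)."""
    labour = sum(hours_by_role[r] * rate_by_role[r] for r in hours_by_role)
    return labour * (1 + tax_pct / 100)

# Hypothetical project: 40 designer hours and 120 developer hours.
hours = {"designer": 40, "developer": 120}
rates = {"designer": 25, "developer": 30}   # per hour, illustrative
quote = tm_estimate(hours, rates, tax_pct=10)
```

A project-basis quote, by contrast, is a single fixed figure per deliverable; the value of a sketch like this is in checking that every role and levy actually appears in the vendor's hourly breakdown.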
Requirement specifications
It is essential that, in the first phase of the project, clear requirement specifications are developed, that the project moves according to those specifications, and that any updates are recorded and agreed to. As a promoter, think about the functionalities that you need in your website - content, graphics, forms, alerts, processes and reports - and freeze the requirements only once they are clear. Try not to change your requirements once development has started. If there is any deviation from the original plan, it should be negotiated, and the cost and time readjusted, at that stage itself. Ensure that developers are required to prepare, submit and get your agreement on a requirement specifications document
and a design document that would act as base documents for the development. Remember that unclear specifications and scope creep at later stages are the biggest causes of dispute between customers and developers. Ensure that your project is not ruined by such disputes. Testing, removal of bugs, and maintenance of the site for at least one month must be part of the development package.
Contract / agreement If your web site has unique functionality, consider signing a non-disclosure agreement (NDA) and ensure that you receive the source code and own it after it is developed. There should be a contract that specifies
the terms and specifications. The contract should also clearly specify how the website would be handed over, and the training to be provided, should the owner decide to maintain it themselves after the maintenance period. Choosing the right developer or development firm can make or mar your web business initiative, so take your time at this stage. Ensure that you are convinced about the capabilities of the selected developer and that your contracts are detailed and specific.

After sign-on
Your job does not end, but actually starts, after you have chosen a web developer and given them the task of building the business website. You will be called upon to make decisions or choose from different options as your website is being built. A good developer tries to clarify these issues at the beginning of the project, when the requirement specifications are being written. In any case, you should be sufficiently aware of the issues involved to be able to make an informed choice. Some of the most important issues are the choice of a domain name, the scope of your web business, technology platforms, hosting of your site and information design. All these issues and more are covered in detail in Web Business Age. Also check out other blog posts and websites that give you suggestions on branding and e-marketing.

Scope creep (also called focus creep, requirement creep, feature creep or function creep) in project management refers to uncontrolled changes in a project's scope. Scope creep is a risk in most projects, and most megaprojects fall victim to it; it often results in cost overrun. A 'value for free' strategy is difficult to counteract and remains a difficult challenge for even the most experienced project managers. The phenomenon can occur when the scope of a project is not properly defined, documented or controlled. It is generally considered a negative occurrence and should therefore be avoided. Typically, the scope increase consists of either new products or new features of already approved product designs, without corresponding increases in resources, schedule or budget. As a result, the project team risks drifting away from its original purpose and scope into unplanned additions. As the scope of a project grows, more tasks must be completed within the budget and schedule originally designed for a smaller set of tasks; thus, scope creep can result in a project team overrunning its original budget and schedule. If the budget and schedule are increased along with the scope, the change is usually considered an acceptable project addition, and the term 'scope creep' is not used. Scope creep can be a result of:
• a disingenuous customer with a determined 'value for free' policy
• poor change control
• lack of proper initial identification of what is required to bring about the project objectives
• a weak project manager or executive sponsor
• poor communication between parties
• agile software development based on subjective quantifications.
Mindgames
YOUR SPACE

A senior colleague was stranded in traffic, in a heavy monsoon downpour, due to some VIP movement. He had a choice: catch up on his emails, conduct business calls, and curse the infrastructure, taxes, government policy and politicians. Or he could catch up on life - regroup, call an old friend, speak with his extended family, or simply listen to music. He decided to make the best of his situation: he put on some music, thought about life, and then decided to call up his old friends - friends who, like him, were stuck in traffic at different points. So they decided to get together at a point on the way and spend some time together. It is a memory he cherishes as one of the experiences of his lifetime.

A lift, flight, car, train, hotel or a house becomes our living space - while we inhabit it. The time we spend in these spaces can be used for recharging ourselves. When I used to travel by suburban train, the travel time, including waiting for the train, used to be two hours one way. We boarded and travelled in the same compartment of the 8:44 local every day. We eventually created a group and made the compartment as important a feature in our lives as our homes. We would share each other's stories, play games and even celebrate birthdays! So you get the point.
Long distance journeys
Long distance travel can become really tiresome. The only way you can keep yourself fit on such journeys is to keep the mind and body relaxed but charged. Here are some tips:
• Stretch your hands and legs and rotate your ankles and wrists once in a while, to regulate the blood flow in your limbs, stay alert and avoid cramps and stiff joints.
• If you are taking a flight or a train, keep walking up and down the aisle. If you are on a train, take a light stroll off the train when it stops at a junction.
• Consume more liquids than solids during long journeys, to avoid indigestion and headaches.
• Keep yourself occupied, either by reading or by watching good programmes.
• If you are travelling by car or train, take photographs of the scenery or interesting spots on the way.
• If you have to break your long journey, make use of the opportunity to exercise by walking at the airport.
• You can even have a bath, and shop in the meanwhile. You will feel refreshed and recharged for your next journey!

Hotel stay
Once we reach our destination, the city of our visit and the hotel we are staying in become our home for the duration of our stay. They allow us to rest, refresh and recharge after a hectic day of sight-seeing. That is why choosing a good hotel is imperative, also from a safety perspective. Here's what you can do to make your stay comfortable:
• Plan your trip in advance. It gives you enough time to look for good, clean hotels and strike good deals.
• Make sure that your hotel is in a good locality, with easy access to the marketplace, bus stop and restaurants. It shouldn't feel like a strain to get back there, or like a prison with no proper transport to get you out.
• Check whether your hotel room has good ventilation, clean toilets and comfortable beds.
• Ensure that it offers at least a decent, if not spectacular, view. Looking at the back of other buildings when away from home is depressing.
• Select a room that allows for privacy, but make sure it isn't too isolated, for safety reasons.
• For the sake of your own hygiene, use your own toiletries and towels.
• Open the windows and air the room as soon as you occupy it, to get rid of unpleasant odors or other contaminants.
• Insist that your bedding be cleaned and replaced every single day of your stay.
• Ensure that the laundry is freshly done.

Finally, and most importantly, remember: 'Your life is for you to make' - good or stressed.
Grooming
Do you know what it feels like to sleep well at night and be wide awake, creative and dynamic all day long? If you’re like most people, you probably don’t. Try these simple steps to improve your quality of sleep
Most people don't value sleep. It is considered a luxury rather than a necessity and, as a result, most people - achievers included - aren't willing to adjust their schedules to get adequate rest.
Determining your Personal Sleep Quotient (PSQ)
Your PSQ is the optimum amount of sleep your body needs to function at its best. Failing to reach your personal sleep requirement diminishes concentration, productivity and work quality. If we operated machinery the way we're driving our bodies, we'd be guilty of reckless endangerment. After 17-19 hours without sleep, your brain activity is similar to that of someone with a blood alcohol content (BAC) of 0.05 (0.08 being the legal limit for intoxication in most countries).
How to determine your PSQ
Pick a bedtime when you're likely to fall asleep quickly - at least eight hours before you need to get up. Keep to this bedtime for the next week and note when you wake up each morning. You might rise early for a few days if you're used to sleeping less, but that habit will soon give way to longer rest. If you need an alarm to wake up, if it's difficult to get out of bed, or if you're tired during the day, eight hours isn't enough for you. Move your bedtime up by 15-30 minutes the next week. Continue doing this each week until you awaken without an alarm and feel alert all day. When you have found what you think is your ideal bedtime, cut 15 minutes off it to see if you're sleepy the next day. If so, you've nailed your PSQ. Add those 15 minutes back, and you're set. Most adults require 7.5-9 hours of sleep to be fully awake and energised all day long. As a rule of thumb, you'll probably have to add one more hour to your current sleep schedule. Go to bed and wake up at the same time every day. Every
day means seven days a week, 365 days a year - regularity is vital for setting and stabilising your body's biological clock. It only takes a few weeks to fully sync the hours you spend in bed with the sleepy phase of your clock. When this happens, you won't need an alarm clock to wake you up, and the hours you spend awake will correspond to when you feel most alert and refreshed. By sticking to a schedule, you'll be significantly more alert than if you slept for the same total amount of time at varying hours during the week. And eventually, such regularity will reduce the total sleep time required for maximum daytime alertness. Yes, a regular sleep routine will enable you to do just as well on a little less sleep. Sleep researchers and scientists at the Harvard Medical School found that when you alter your sleep schedule by even a few hours, mood deteriorates. Shift-workers in particular experience more anxiety and depression, partly because they're out of sync with their biological clocks.
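The week-by-week routine for finding your PSQ is, in effect, a small search loop: keep moving bedtime earlier until you wake unaided, then stop. A sketch of that loop, with the "still tired?" judgment abstracted as a caller-supplied check; the 8.5-hour figure in the example is a hypothetical sleeper, not a recommendation from the book.

```python
from datetime import datetime, timedelta

def find_psq_bedtime(wake_time, still_tired, step_minutes=15):
    """Start eight hours before wake-up; each 'week', move bedtime
    earlier by step_minutes until the tiredness check passes."""
    bedtime = wake_time - timedelta(hours=8)
    while still_tired(bedtime):
        bedtime -= timedelta(minutes=step_minutes)
    return bedtime

wake = datetime(2024, 1, 8, 6, 30)
# Hypothetical check: this sleeper needs 8.5 hours in bed to wake unaided.
still_tired = lambda bt: (wake - bt) < timedelta(hours=8, minutes=30)
bedtime = find_psq_bedtime(wake, still_tired)  # lands at 22:00 the prior night
```

In real life the "check" is a week of observation, not a function call, but the stopping rule is the same.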
Sleep in one continuous block Sometimes it’s impossible; any new parent or an older guy with prostate woes will tell you so. But so-called ‘fragmented sleep’—even for hours—is not physically or mentally restorative and causes daytime drowsiness. It also dramatically compromises learning, memory, productivity and creativity. In fact, six hours of continuous sleep is more restorative than eight hours of fragmented sleep. Some people anticipating a night of fragmented sleep often go to bed early hoping to manage eight hours of total sleep within a 10-hour period. But that’s a waste of time. So, don’t let yourself doze on and off for hours. Limiting your time in bed to your PSQ, and not a minute [or 20] more, will eventually bring greater benefits. Many people use snooze bars thinking that they’ll get an extra hour of sleep after the first alarm goes off. Wrong! If you set the alarm to ring every 15 minutes for an hour, at best, you might get 18 – 20 minutes worth of fragmented sleep. It’s much better to go to bed one hour earlier and wake up naturally.
Make up for your lost sleep as soon as possible Every hour that you are awake you are building sleep debt. Every 2 hours of wakefulness requires 1 hour of sleep. It’s a 2:1 ratio. That is why the general rule is that after 16 hours of being awake, you need 8 hours of sleep. When this rule is violated, sleep debt accumulates quickly. Before long, you’ll crash [hopefully not on the road], get sick or perform poorly.
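The 2:1 rule above is easy to put in numbers. A minimal sketch of the bookkeeping, assuming the article's ratio; the hour figures in the example are made up for illustration.

```python
def sleep_debt(hours_awake, hours_slept, ratio=2.0):
    """Per the 2:1 rule: every `ratio` waking hours demand one hour
    of sleep; return the shortfall (never negative)."""
    needed = hours_awake / ratio
    return max(0.0, needed - hours_slept)

# 16 hours awake demands 8 hours of sleep; only 6 were slept:
debt = sleep_debt(16, 6)  # -> 2.0 hours of debt to repay
```

As the next section advises, a debt like this is best repaid in small instalments over several nights, not in one long lie-in.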
How to make up for lost sleep
Don't try to replace it all at once. If you skipped a night, don't try to sleep for 14-16 hours the next night. That's just about impossible, because your long-established biological clock is pre-programmed to put you to sleep and wake you up at a set time every day. Instead, apportion your sleep debt out over the next few days until you feel better. Catch up on lost sleep by going to bed earlier than usual, not by sleeping late. If you sleep late, you'll find it harder to fall asleep the following night at the usual hour.

Don't try to make up for large sleep losses during the week by sleeping in on the weekend. This is like trying to get fit or lose weight by doing all your exercising or dieting on Saturdays and Sundays. Your brain doesn't have a separate biological clock for weekends. Changing your sleep/wake times disturbs your body's natural rhythm. If you sleep till noon on Sunday, for instance, you won't be very tired come your regular bedtime. Maybe you'll doze off sometime after midnight, but just a few short hours later, your alarm will jerk you back to consciousness and you'll have to crawl to work with the Monday morning blahs. You'll have induced jet lag without leaving your zip code.

Try napping to pay back your sleep debt. However, be careful not to nap too long or too late in the day, or you'll further disturb your sleep cycle. Whenever your sleep is significantly disturbed, return to your regular schedule as soon as possible. For years of accumulated sleep debt, it may take as long as 4-6 weeks until you discipline your sleep. But the resulting alertness, mental and physical performance, and enjoyment of life will be more than worth the discipline it took to get there. In sum, determine and meet your PSQ, establish a regular bedtime schedule, get one long block of continuous sleep, and be sure to make up for lost sleep. As you can see, the cure for sleep loss is painless and pleasurable. 
All it takes is just a little discipline. © 2010, Dr James B Maas & Rebecca Robbins excerpted from, “Sleep for Success”
IMPACT OF SLEEP DISORDERS
In order to be properly focused and to perform well on the job, a person needs an ample amount of sleep the night before. However, there are a number of common sleep disorders that can disrupt this sleep, causing people to have trouble staying focused and alert while on the clock. In certain professions, one must constantly be awake and alert in order to maintain one's own safety, as well as that of other people who may be affected by the person's lack of sleep. Three of the most common sleep disorders that affect job performance are restless legs syndrome, sleep apnea and insomnia.

Restless legs syndrome is one sleep disorder that can severely impact job performance. This condition often causes a person to have pain in their arms and legs, making them lose sleep due to the discomfort. Because of the pain, people tend to toss and turn a lot, as they feel the urge to move their limbs. While this can happen at any time, it often occurs when a person is trying to sleep, leaving them feeling tired the next day. Medication can be prescribed by a doctor to help with the condition.

Another common sleep disorder that can affect job performance is sleep apnea. Sleep apnea occurs when a person's airway closes while they are asleep, because the muscles of a person with this disorder relax too much. When this happens, it can feel as though the person is choking and not receiving enough air. This, in turn, causes the person to wake up, and it can occur multiple times throughout the night. The next day, the person may feel as though he or she has had little or no sleep, severely impacting job performance.

Insomnia is yet another common sleep disorder that can affect job performance. This condition is characterized by the fact that a person simply cannot fall asleep over a long period of time. There are a number of reasons why a person may develop insomnia, but they all cause a person to not get enough rest. 
Like with other sleep disorders, it can be extremely detrimental to a person’s job performance.
NATURAL SOLUTIONS TO SLEEP DISORDERS
If you're struggling to fall asleep each night, or have trouble staying asleep when you do, you've probably tried over-the-counter sleep aids. We would like to suggest five natural approaches to beating insomnia.
Dry skin brushing: Wait. What? Yup. Dry skin brushing. Dry skin brushing is a natural method used to detoxify the body through gentle massage with a dry, long-handled, soft-bristled brush. How is it a natural treatment for insomnia? Dry skin brushing removes toxins from the largest organ in our body: the skin. Our bodies are filled to the brim with over-processed food, chemicals found in our cosmetics, and environmental pollutants. Ridding your body of these toxins will have a surprisingly relaxing, calming effect on the mind and help rebalance your natural sleep-wake cycle. For more on dry skin brushing, check out my article on it.

Epsom salt bath: Epsom salt is not really salt; it's magnesium sulfate, and it's responsible for the regulation of enzymes in the body, which promote proper nervous system function. A nice, hot soak is a natural treatment for insomnia that will soothe your restless mind and help you fall asleep.

White noise: It's possible you're waking in the middle of the night because your neighbor works odd hours, a train passes through a block away, or an appliance kicks on at certain times. Though asleep, our brains are still wired to pick up any sound that seems "unusual". It's a natural defense mechanism we picked up in our cave-man days, when we still had to worry about being eaten. A desktop fan left on all night may be just the natural treatment for insomnia you're looking for.

Self hypnosis: Hypnosis for insomnia relief might sound a bit unusual. Settle comfortably into bed, on your back if you can. Close your eyes and take five deep breaths. Then, breathe normally. Allow your arms and legs to get heavy, relax your shoulders, and let your lips part slightly with your tongue just touching the roof of your mouth. Say to yourself, "I am peacefully, deeply asleep", over and over. What may happen is that your body will relax deeply first while your mind is still awake. 
Though it’s a bit disconcerting, keep breathing and relaxing and you should ease into a peaceful sleep. This natural treatment for insomnia may take a few days to work through the night. Be patient. Food allergy elimination: If you have a hidden food intolerance or allergy, anxiety and insomnia are common symptoms. The best thing you can do if you suspect you have a food intolerance or allergy is to do an elimination diet and reintroduce foods slowly to test for a reaction. The most common food intolerances/allergies are wheat, gluten, dairy, corn, soy, egg, citrus, nuts and seafood. Eliminate all of these from your diet for at least one week before reintroducing one new food each day. This simple method may offer you both the cause and the solution to your chronic insomnia!
Human Capital
By L M Kamble
A person's mind is their most powerful tool. Yet very few people take intentional steps toward "upgrading" their brain and trying to become smarter. Here are some scary statistics from an article published in The Economist:
• In 1991, a worker with a bachelor's degree earned 2.5 times as much as a high-school dropout.
• In 2010, a worker with a bachelor's degree earned 300% as much as a high-school dropout.
There is an obvious trend toward paying people who have "upgraded their brain" more money. This probably isn't too surprising, but consider this: 42% of people who graduate from college never read another book. 42%! That says that a good number of people get out of college and just assume they have arrived - no need to work on getting any smarter. Obviously there are ways to learn other than reading, but books have traditionally been, and still are, one of the main ways to acquire formal knowledge. If you are not reading, it is very unlikely you are growing. It is even less likely that you are actually getting smarter in ways that have value outside of your weekly tasks. There are seven general ways to upgrade:
1. Read
2. Get a degree
3. Seek out new experiences
4. Think
5. Practice
6. Write
7. Engage in difficult assignments
Read
Reading is the primary way education is done. All successful people read; hence, all who want to be successful need to read. Reading is the fundamental bedrock of upgrading your brain and becoming smarter. You have to read regularly. But not all reading methods and not all reading contents are equal. There is a very big difference between reading on a computer and reading a physical book. A few months ago there was a study that compared reading on an iPad or Nook to reading a book, and the researchers found that people remembered less when reading from the iPad. It had something to do with the way we perceive a lighted surface vs. a reflective one. Perhaps lighted surfaces are associated with cinema, TV or digital displays, which are traditionally used for entertainment, and so are less engaging. Reading on the Internet is also quite different from reading a book. A book presents a clear start and end point. There are more barriers to publishing a physical book than to getting something up on the web; chances are a book will have more thought behind it than an article published on a web page. In addition, it is much easier to jump from place to place on the web, so Internet articles don't typically require or inspire the same level of concentration as an intense book might. The Internet is also a less engaging medium, because there are a lot of distractions. The web is an incredible tool and gives users access to information that would have been impossible to reach in the past. However, take care not to let it crowd out traditional reading. Use the Internet for things the Internet is good for, and use books for things books are good for. The Internet is great for looking up a single fact - something that can take a very long time with a book. Books are great for deep study of a
subject. So what should you read?
1. Classics.
2. Books related to your area of interest, and the expertise needed for competitiveness.
3. Books on topics from a completely different field, to open up your mind.
This approach will help make sure you are getting a well-rounded reading experience that prepares you for today and tomorrow. If all of your reading falls outside of these three categories, you are probably reading more for entertainment.
Get a(nother) degree If you don’t have a college degree, get one. However, not every degree is equal. You can get a diploma without necessarily learning very much just like you can become very smart without getting a diploma. You need two things from a degree: 1. The recognition that comes from having a formal college degree. 2. Knowledge that comes from having worked hard at an academic pursuit. If you already have a degree, the same thing applies. Get another one. To be competitive in today’s job market, most people are going to need training and recognition that comes from studies beyond the bachelors level. Usually a master’s degree is a good choice, but there are graduate certificate and citation programs that can be excellent options. Even if you are pursuing a master’s degree, a graduate citation can be an excellent stepping stone that gives you a way to quantify your education as you pursue your master’s degree.
Seek out new experiences
Our brains grow when we do something new with them. If you aren’t doing anything new, your brain is not growing. Reading new books, studying new topics and going back for another degree are all things that can give your brain new experiences. But what about more mundane things? Here are some simple things you can do that will give you new experiences.
1. Brush your teeth with your nondominant hand a few times each week.
2. Read a normally unread section of the newspaper or a magazine.
3. Go into a store that you’ve never had any desire to visit.
4. Draw pictures.
5. Draw pictures with your nondominant hand.
6. Travel to work using a different route or method.
7. Experiment and cook a type of food you’ve never made before.
8. Watch a few movies that are in a different language.
9. Attend a lecture on a topic you know nothing about.
10. Spend a few hours in municipal court as an observer.
11. Attend a city commission meeting.
12. Go to a restaurant that is primarily frequented by people who aren’t in your age group.
13. Learn to juggle. (Highly recommended.)
14. Try non-standard work.
15. Strike up conversations with people you normally wouldn’t talk to.
16. Visit a library you’ve never been in.
17. Browse a section of the library that you’ve never explored.
18. Attend an art display in a style you don’t particularly care for.
19. Attend musical recitals for different instruments and a modern composer.
20. Take the stairs in a building where you’ve only taken the elevator.
21. Listen to a different radio station.
22. Spend some time reading in the room of your house where you spend the least time.
23. If you have land or a yard, go stand in a part of it where you don’t think you’ve ever been before.
24. Try out a different operating system (OS). (Many can run from a CD.)
25. Go to a school board meeting.
26. Go star gazing.
27. Write a letter to someone you’ve never written to before.
28. Ask an older relative about the things they remember from when they were your age.
None of these activities is likely to be life changing. However, each one will change you just a little bit, and each one will give your brain something new to think about and process.
Think
We think all the time, but most of us don’t spend any structured, intentional time just thinking. We think just enough to start our next action. There is great value in taking the time to deliberately sit and think. One of the reasons we don’t do this is that it usually just becomes daydreaming. Daydreaming isn’t necessarily a bad thing, but it isn’t as directed as what we are trying to achieve by sitting and thinking. The funny thing about thinking is that there really isn’t much information on how to go about doing it. There are books like How To Think Like Leonardo da Vinci that are interesting, but they tend to focus more on how to be creative than on how to just think. On one hand this is disappointing, but on the other it makes sense: thinking is a huge category, and it is very difficult for one person to explain how they think to someone else.
Here are some guidelines for productive thinking that work.
1. Decide what to think. To be really productive, your thinking needs to be directed. Here are some things you might want to spend time thinking about: your career plans and how to get the most out of your current job; a business idea; personal goals, clarifying what you want to achieve in life and how to reach those achievements.
2. Find a quiet place. What qualifies as a distraction will differ from person to person and may vary depending on what you are thinking about. A distraction-free environment for clarifying your personal goals might be a coffee shop, but if you are working on a mathematical theorem, the same coffee shop might be full of distractions.
3. Write down what you hope to accomplish. Without a plan you won’t know whether you accomplished what you set out to do. Get it down on paper to make sure you are clear about what you want to get out of this “thinking session.” Your goal can be as specific or as general as necessary, but try to choose something where you can tell whether you succeeded. Writing down “think about businesses” isn’t something you can really quantify as having done or not–or at least it is hard to tell whether you really accomplished anything. “Come up with 3 ideas for a business I can run from home” is easier to claim success against.
4. Take notes. This may sound funny: why would you take notes on your own thinking? Getting something down on paper lets you see your thought process much more easily than when it is only in your mind. Thinking is the process of interacting with information, and getting some of that information out in front of you is a great way to focus and be creative. These don’t need to be formal notes. You can jot ideas, draw diagrams, doodle pictures or create mind maps to help clarify what you are thinking.
Practice
Musicians and sports figures practice constantly, but most other people never practice. If you can find a way to practice your skills, you can become better at what you do. Practice can make you faster, more efficient and better at your job. The trick is to find a small unit that you can repeat in a way that will increase your skill. Here are some things you might be able to practice: if you are slow at typing, practice typing; public speaking can be practiced, and good presentation skills are essential to many careers; writing is a skill that can be practiced, and few people wouldn’t benefit from being able to write a bit better.
Write
Writing is underrated. The discipline of getting thoughts from your head onto paper is very valuable, and you can learn a lot simply by writing down your ideas and observations. Writing is the process of making your thoughts concrete and visible. It allows you to clarify what you are thinking and refine your ideas. Writing makes you smarter because it forces you deeper into a topic and shows you areas of your topic that you don’t fully understand.
Engage in difficult assignments
Doing things that are difficult raises your ceiling and increases your capabilities. Basketball coaches sometimes have players practice with a smaller hoop. This makes it a lot harder to score during practice, but when the game comes and they are shooting at a normal-sized hoop, it seems much easier to make shots. They make practice harder in order to raise the bar on their performance when it really matters. In some ways this sounds like the suggestion to find things to practice, and there is some overlap. However, doing things that are hard can involve big projects and larger-scale work, rather than something small that you can practice over and over again. If you tackle writing a 100-page research paper, the 5-page papers you are subsequently assigned will seem trivial in comparison. If you want your brain to be operating at its peak, you need to constantly ask yourself, “When was the last time I did something where I felt truly challenged? When was the last time I was seriously worried that I might fail?” If you haven’t had any of those experiences recently, you may need to seek out a difficult assignment or project to make sure your brain isn’t becoming stagnant.
Conclusion
Your brain is your most valuable asset. Many people leave their brain’s development up to chance. If you want to safeguard against becoming stale and irrelevant, you need to make a conscious effort to upgrade your brain, develop your skills and ensure that you are moving forward–not backwards. Likewise, every organization needs to make the effort to evangelize skill improvement. Skill-based training can positively affect your employees and your company. Training increases employee productivity.
In addition to learning how to complete new tasks and take on more responsibility, employees can learn advanced techniques to help them complete everyday tasks more efficiently. For example, sending your bookkeeper to an advanced Excel class may help him or her learn shortcuts that simplify the accounting process. Training reduces turnover. Employees who don’t receive guidance or have difficulty learning the ropes are much more likely to leave your company. Employees are less likely to leave if they have the opportunity to learn new skills and keep up with their industry. Training improves job satisfaction. Investing time and money in employees’ skills makes them feel valued and appreciated, and it challenges them to learn more and get more involved in their jobs. Higher job satisfaction ultimately results in reduced turnover and higher productivity. Training aids in the recruiting process. If you’re committed to training, you’ll be more willing to hire a desirable candidate who lacks a specific skill. Training also makes your company more attractive to potential employees because it shows them that they have room to grow and take on new challenges. In addition, training existing employees could reduce the need to hire new staff. Training rewards long-time employees. You’ll be more willing to promote existing employees who have learned new skills and are ready to take on new challenges. Training reduces the need for employee supervision. Not only does skill-based training teach employees how to do their jobs better, but it also helps them work more independently and develop a can-do attitude.
The author, Mr. L M Kamble, is an ex-GM of RBI. He is currently with PMC Bank, heading the HR function. He may be reached at laxmank@hotmail.com.
Although individual employees might differ on how they define the perfect boss, effective leaders share similar characteristics in their communication and management styles. These include a combination of leadership abilities, communication skills and professional expertise.
Leadership and Respect
The perfect boss has strong leadership skills. She can inspire her workers, provide them meaningful guidance and garner the respect of her subordinates.
People Skills
Bosses must have strong people skills. They must be able to communicate effectively with employees from diverse backgrounds and remain diplomatic when dealing with sensitive issues.
Authority
Effective bosses command authority and are able to make difficult decisions that are in the best interest of the company.
Knowledge
The perfect boss does not need to be an expert in his field, but he should have a clear understanding of the work each of his employees performs.
Attitude
Ideal bosses are open-minded and take their employees’ opinions and ideas into account when making decisions for the company.
The Visionary
Characteristics: Creates a reality-distortion field that makes people believe the improbable. Commonly found in high-tech and biotech habitats. Plumage: Nike Air Jordan 7 Retro running shoes. Archetype: Steve Jobs. Quote: “Seriously, this technology is going to change the world.” Pros: Can be highly inspirational, especially if you’ve got founder’s stock. Cons: Yells at people who don’t share his vision 24/7. Warning: You will not have a life. Care and Feeding: Drink the Kool-Aid. Once you decide that the career points are worth the long hours, throw yourself into the work as if your job is the only thing that matters.
The Climber
Characteristics: Desperately wants to get to the top. Plumage: Armani Collezioni wool suit. Archetype: Niccolò Machiavelli. Quote: “Let’s run that up the flagpole and see who salutes.” Pros: May create an opening for you if and when he’s promoted. Cons: You’re nothing but a rung on his ladder to success. Warning: Will dump you like a month-old mackerel if you make him look bad. Care and Feeding: Make sure the boss knows you understand that your job is to make him look good. Watch his back and feed him tidbits from the corporate grapevine.
The Bureaucrat
Characteristics: Believes the world would fall apart without rules and regulations. Found at government agencies, defense contractors and most Fortune 100 companies. Plumage: White dress shirt, dark tie. Archetype: Dilbert’s PHB (Pointy-Haired Boss) Quote: “If it ain’t broke, don’t fix it.” Pros: Highly predictable and thus easy to manipulate. Cons: Rendered totally ineffective by any major change in the business environment. Warning: Can grind your creativity to dust. Care and Feeding: Learn to love red tape. Keep all activities within the context of what’s been done in the past, whether it worked or not. Document everything and share that documentation with everyone.