December 2019
“The only way of discovering the limits of the possible is to venture a little way past them into the impossible.” - Arthur C. Clarke
“In a chronically leaking boat, energy devoted to changing vessels is more productive than energy devoted to patching leaks.” - Warren Buffett
Contents December 2019
GIA Recognition (page 09)
ZIF AIOps app (page 10)
Global Industry Analysts, Inc. identifies GAVS as one among the top global competitors in the AIOps Platform market!
GAVS Technologies has been recognized as a Competitor in their list of Key Global Players in the AIOps Platform market.

GAVS is Blazing the Trail: Launching the First-ever Native Mobile Version for an AIOps Platform!
GAVS is excited to announce that ZIF has now gone mobile! We are proud to say that ZIF is currently the only AIOps Platform in the market to have a native mobile version. Look for ZIF AIOps on the App Store and Google Play Store.
Digital Dirt (page 11)
SDN (page 13)
Cleaning up our Digital Dirt
Now, what exactly is digital dirt, in the context of enterprises? It is highly complex and ambiguous to precisely identify digital dirt, let alone address the related issues. Chandra Mouleswaran S, Head of Infra Services at GAVS Technologies, says that not all the applications that run in an organization are actually required to run.
Authors: Sri Chaganty, Chandra Mouleswaran S

Software Defined Networking (SDN)
As per the Open Networking Foundation’s definition, Software-Defined Networking is the physical separation of the network control plane from the forwarding plane, where a control plane controls several devices.
Authors: Chandrasekar Balasubramanian, Suresh Ramanujam
SQL (page 16)
Deepfakes (page 19)
The shifting pH of Databases from ACID to BASE
They say data is the new oil. I would add that data is the new gold. Data is one of the driving forces of Industry 4.0, the next phase of the Industrial Revolution. However, ‘Big Data’ is jargon that has plateaued in the last couple of years; bigger isn’t always better, and big insights are more important.
Author: Bargunan Somasundaram

Deepfakes - Another reason for you not to believe everything you see on the internet!
An episode of the latest season of the speculative fiction series, Black Mirror, explored the mounting risks of advanced technology in the entertainment industry. The episode depicted a pop star being replaced by her digital avatar.
Author: Soundarya Kubendran
Deep Learning (page 21)
Customer Leverage Strategy (page 23)
A Deep Dive into Deep Learning!
The Nobel Prize winner and French author André Gide said, “Man cannot discover new oceans unless he has the courage to lose sight of the shore”. This rings true with enterprises that made bold investments in cutting-edge AI and are now starting to reap rich benefits.
Authors: Padmapriya Sridhar, Kathakali Basu

Do You Have a Strategy for Leveraging Your Key Customers?
Have you ever thought about what would happen if the top 10% of your happy customers went out and told your other customers and prospects how great your company is? Have you ever considered the impact of having your greatest advocates out in force to help tell your story?
Author: Betsy Westhafer
Customer Centricity (page 25)
Demystifying Customer Centricity - A simplistic take
Vidyarth explores the subject of customer centricity, how he has related to it through personal experiences and shining examples of organizations that excel at enabling customer centricity, and what it means for us at GAVS.
Author: Vidyarth Venkateswaran
Editor’s Notes

It has been yet another exciting month at GAVS, with us being recognized as one of the top global competitors in the AIOps Platform market by Global Industry Analysts, Inc. We’ve also blazed the trail by launching the first-ever native mobile version for an AIOps Platform.

A recent global survey by one of the bigger consulting firms has revealed that while the adoption of AI among organizations is on the rise, managing risks and workforce retraining remain the main obstacles for the future.

The Uber self-driving car accident in the US further reinforces this finding. It was found that Uber hadn’t programmed its autonomous vehicles to react to jaywalking pedestrians. However, further developments in the case led to the blame being pinned on the back-up driver: it turns out that the driver was watching a reality show on her mobile device, leaving her unable to apply the brakes at the right moment.

Experts believe that for AI-driven organizations, ad-hoc training approaches will not suffice; the change should reflect in the culture and DNA of organizations. This brings me to another interesting part of the survey. Most respondents said AI might decrease their workforce, but many felt it would increase it. More importantly, it was found that implementing AI will shift the workforce across functions. Thus, the focus should be on appropriate training, reskilling and upskilling, instead of resisting the change that is bound to happen.

In this edition, we bring you an article by our CTO, Sri Chaganty, and our Head of IP & Infra, Chandra Mouleswaran S, “Cleaning up our Digital Dirt”. They write, “…not all the applications that run in an organization are actually required to run. The applications that exist, but are not used by internal or external users or internal or external applications, contribute to digital dirt.”

Chandrasekar and Suresh have written “Software Defined Networking (SDN)”. They write, “Software-Defined Networking (SDN) makes networks agile and flexible. It provides better network control and hence enables cloud computing service providers to respond quickly to ever-changing business requirements.”

Bargunan has written “The shifting pH of Databases from ACID to BASE”. He writes, “They say data is the new oil. I would add that data is the new gold. Data is one of the driving forces of Industry 4.0, the next phase of the Industrial Revolution.”

Soundarya has written “Deepfakes – Another reason for you not to believe everything you see on the internet!” She writes, “Deepfake is a technology that uses deep learning to fabricate entirely new scenes or alter existing videos. Although face-swapping has been prevalent in movies, it required skilled editors and CGI experts.”

Priya and Kathakali have written “A Deep Dive into Deep Learning!”. They write, “Artificial Intelligence is shredding all perceived boundaries of a machine’s cognitive abilities. Deep Learning, at the very core of Artificial Intelligence, is pushing the envelope still further into uncharted territory.”

Betsy Westhafer has written “Do You Have a Strategy for Leveraging Your Key Customers?” She writes, “Have you ever thought about what would happen if the top 10% of your happy customers went out and told your other customers and prospects how great your company is?”

Vidyarth has written “Demystifying Customer Centricity - A simplistic take”. He writes, “I am among those many who are intrigued by how some organizations can seamlessly surpass expectations and deliver a great customer experience.”

Happy Reading!

Soumika Das
Global Industry Analysts, Inc. identifies GAVS as one among the top global competitors in the AIOps Platform market!
In the report ‘AIOps Platform - Market Analysis, Trends, and Forecasts’, Global Industry Analysts, Inc. (GIA), a leading publisher of off-the-shelf market research, has analyzed and sized the Large Enterprises segment of the AIOps Platform market worldwide. GAVS Technologies is recognized as a Competitor in their list of Key Global Players. The report says that these organizations have the potential to grow at over 31.3% and will add significant momentum to the global growth of the AIOps Platform market, which is poised to reach over US$9.2 Billion by the year 2025.

https://www.strategyr.com/market-report-aiops-platform-forecasts-global-industry-analysts-inc.asp
GAVS is Blazing the Trail: Launching the First-ever Native Mobile Version for an AIOps Platform!
GAVS is excited to announce that ZIF has now gone mobile! We are proud to say that ZIF is currently the only AIOps Platform in the market to have a native mobile version. Look for ZIF AIOps on the App Store and Google Play Store.

GAVS’ Zero Incident Framework™ (ZIF) is an AIOps-based TechOps platform that enables organizations to trend towards a Zero Incident Enterprise™. ZIF comes with an end-to-end suite of tools that ITOps teams need. It is a pure-play AI platform powered entirely by unsupervised, pattern-based machine learning!
Cleaning up our Digital Dirt
Sri Chaganty
Chandra Mouleswaran S

Now, what exactly is digital dirt, in the context of enterprises? It is highly complex and ambiguous to precisely identify digital dirt, let alone address the related issues. Chandra Mouleswaran S, Head of Infra Services at GAVS Technologies, says that not all the applications that run in an organization are actually required to run. The applications that exist, but are not used by internal or external users or internal or external applications, contribute to digital dirt. Such dormant applications accumulate over time due to uncertainty about their usage and a lack of clarity in sunsetting them. They stay in the organization forever and waste resources, time and effort. Such hidden applications burden the system, and hence need to be discovered and removed to improve operational efficiency.

Shadow IT
Shadow IT, the use of technology outside the IT purview, is becoming a tacitly approved aspect of most modern enterprises. As many as 71% of employees across organizations are using unsanctioned apps on devices of every shape and size, making it very difficult for IT departments to keep track. The evolution of shadow IT is a result of technology becoming simpler and the cloud offering easy connectivity to applications and storage. Because of this, people have begun to cherry-pick the things that help them get work done easily. Shadow IT may not start or evolve with bad intentions, but when employees take things into their own hands, it is a huge security and compliance risk if the sprawling shadow IT is not reined in. Gartner estimates that by next year (2020), one-third of successful attacks experienced by enterprises will be on their shadow IT resources.

Are we prepared to clean the trash? The process of eliminating digital dirt can be cumbersome; we cannot fix what we do not find. So, the first step is to find these applications using a specialized application for discovery. Chandra further elaborated on the expectations from the ‘Discovery’ application: it should be able to detect all applications, the relationships of those applications with the rest of the environment, and the users using those applications. It should give complete visibility into applications and infrastructure components to analyze the dependencies.
The Discovery Tool
IT organizations should deploy a tool that gives complete visibility of the landscape, discovers all applications – be they single-tenant or multi-tenant, single or multiple instance, native or virtually delivered, on-premise or on cloud – and maps the dependencies between them. Apart from that, the tool should also indicate the activity on those applications by showing the users who access them and the response times, in real time. The dependency map, along with user transactions captured over time, will paint a very clear picture for IT Managers and might bring to light some applications and dependencies that they probably never knew existed!

Discover is a component of GAVS’ AIOps Platform, Zero Incident Framework™ (ZIF). Discover can work as a stand-alone component and also cohesively with the rest of the AIOps Platform. Discover provides Application Auto Discovery and Dependency Mapping (ADDM). It automatically discovers and maps the applications and topology of the end-to-end deployment, hop by hop. Some of its key features are:

• Zero Configuration: The auto-discovery features require no additional configuration upon installation.

• Discovers Applications: It uniquely and automatically discovers all Windows and Linux applications in your environment, identifies each by name, and measures the end-to-end and hop-by-hop response time and throughput of each application. This works for applications installed on physical servers, in virtualized guest operating systems, applications automatically provisioned in private or hybrid clouds, and those running in public clouds. It also works irrespective of whether the application was custom-developed or purchased.

• Discovers Multitenant Applications: It auto-discovers multitenant applications hosted on web servers and does not limit the discovery to the logical server level.

• Discovers Multiple Instances of an Application: It auto-discovers multiple instances of the same application and presents them all as a group, with the ability to drill down to the details of each instance.

• Discovers SaaS Applications: It auto-discovers any requests directed to SaaS applications such as Office 365 or Salesforce and calculates response time and throughput to these applications from the enterprise.

• Discovers Virtually Delivered Applications or Desktops: It automatically maps the topology of the delivered applications and VDIs, hop-by-hop and end-to-end. It provides extensive support for Citrix-delivered applications and desktops. This visibility extends beyond the Citrix farm into the back-end infrastructure on which the delivered applications and VDIs are supported.

• Discovers Application Workload Topologies: The architecture auto-discovers application flow mapping topology and user response times to create the application topology and update it in near real time, all without user configuration. This significantly reduces the resources required to configure service models and operate the product.

• Discovers Every Tier of Every Multi-Tiered Application: It auto-discovers the different tiers of every multi-tiered application and displays the performance of each tier. Each tier is discovered and named, with the transactional throughput and response times shown for each tier.

• Discovers All Users of All Applications: It identifies each user of every application and the response time that the user experiences for each use of a given application.

• Discovers Anomalies with Applications: The module uses a sophisticated anomaly detection algorithm to automatically assess when a response-time excursion is valid; then, if a response exceeds normal baseline or SLA performance expectations, deep diagnostics are triggered to analyze the event. In addition, the hop-by-hop segment latency is compared against historical norms to deterministically identify which segment has extended latency and reduced application performance.
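Baseline-based anomaly detection of the kind described above can be illustrated with a short sketch (plain Python; this is a generic illustration, not ZIF's actual algorithm): a response time is flagged when it deviates from the historical baseline by more than a few standard deviations, at which point deeper diagnostics would be triggered.

```python
import statistics

def is_anomalous(history, latest, threshold=3.0):
    # Flag `latest` if it deviates from the historical baseline
    # by more than `threshold` standard deviations.
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) > threshold * stdev

baseline = [102, 98, 101, 99, 100, 103, 97, 100]  # normal response times (ms)
is_anomalous(baseline, 101)  # within the norm: no action
is_anomalous(baseline, 250)  # excursion: deep diagnostics would be triggered
```

Production systems refine this idea with seasonality-aware baselines and per-segment latency comparisons, but the core check is the same.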
For more detailed information on GAVS’ Discover, or to request a demo, please visit https://zif.ai/products/discover/

About the Authors:

Chandra Mouleswaran S: Chandra heads the IMS practice at GAVS. He has around 25 years of rich experience in IT infrastructure management, enterprise application design & development, and the incubation of new products and services in various industries. He also holds a patent for a mistake-proofing application called ‘Advanced Command Interface’. He thinks ahead, and his implementation of disk-based backup using SAN replication in one of his previous organizations, as early as 2005, is proof of his visionary skills.

Sri Chaganty: Sri is a serial entrepreneur with over 30 years’ experience delivering creative, client-centric, value-driven solutions for bootstrapped and venture-backed startups.
Software Defined Networking (SDN)

Chandrasekar Balasubramanian
Suresh Ramanujam

Introduction
As per the Open Networking Foundation’s definition, Software-Defined Networking is the physical separation of the network control plane from the forwarding plane, where a control plane controls several devices.

In a traditional network architecture, individual network devices make traffic decisions (control plane) and forward packets/frames from one interface to another (data plane); thus, they carry all functions and processes related to both the control plane and the data plane. But in Software-Defined Networking, the control plane and data plane are decoupled. The control plane is implemented in software, which helps the network administrator manage traffic programmatically from a centralized location. The added advantage is that individual switches in the network do not require the intervention of the network administrator to deliver network services.

Software-Defined Networking makes networks agile and flexible. It provides better network control and hence enables cloud computing service providers to respond quickly to ever-changing business requirements. In SDN, the underlying infrastructure is abstracted from applications and network services.

SDN architecture
A typical representation of SDN architecture includes three layers: the application layer, the control layer and the infrastructure layer.
The SDN application layer, not surprisingly, contains typical network applications or functions like intrusion detection systems, load balancing or firewalls. A traditional network uses a specialized appliance, such as a firewall or load balancer, whereas a software-defined network replaces the appliance with an application that uses the controller to manage the data plane behaviour. The three layers communicate through well-defined interfaces: applications communicate with the control layer using northbound APIs, and the control layer communicates with the data plane using southbound APIs. The control layer is considered the brain of SDN; its intelligence is provided by centralized SDN controller software. This controller resides on a server and manages policies and the flow of traffic throughout the network. The physical switches in the network constitute the infrastructure layer.
How SDN works
SDN is internally an orchestration of several technologies. Network virtualization and automation using well-defined APIs are the key ingredients; functional separation adds value by decreasing dependencies. In a classic SDN scenario, a packet arrives at a network switch, and rules built into the switch’s proprietary firmware tell the switch where to forward the packet. These packet-handling rules are sent to the switch from the centralized controller. The switch, also known as a data plane device, queries the controller for guidance as needed, and provides the controller with information about the traffic it handles. All packets destined for the same host are treated in a similar manner and forwarded along the same pathway by the switch.

The virtualization aspect of SDN comes into play through a virtual overlay, which is a logically separate network on top of the physical network. To segment the network traffic, end-to-end overlays can be implemented; thus, users can abstract the underlying network as well. This micro-segmentation is especially useful for service providers and operators with multi-tenant cloud environments and cloud services, as they can provision a separate virtual network with specific policies for each tenant.

Network Function Virtualization (NFV) and SDN complement each other very well. NFV virtualizes network services and abstracts them from dedicated hardware. Nowadays, there is a plethora of physical devices that play specialized roles, such as load balancing, routing, switching, WAN acceleration and content filtering. Service providers consider NFV the solution for deploying new network services by virtualizing network devices.

Some Examples of NFV
• Virtualized Network Appliances, where dedicated network devices are replaced by virtual machines running on servers.
• Virtualized Network Services/Functions (VNFs), which virtualize software-based network monitoring and management services, including traffic analysis, network monitoring and alerting, load balancing, and quality or class of service handling.

Benefits of SDN from a networking architecture perspective
With SDN, an administrator can change any SDN-based network switch’s rules when necessary, prioritizing, deprioritizing or even blocking specific types of packets with a granular level of control and security. Traffic loads are thus efficiently managed with a lot of flexibility, specifically in a cloud environment where a multi-tenant architecture is deployed. Essentially, this enables the administrator to use less expensive commodity switches and have more control over network traffic flow than ever before.

End-to-end visibility of the network, easing network management, is one of the many benefits of SDN. To distribute policies to all the networked switches, there is no need to configure multiple individual network devices; configuring one centralized controller is enough. If the controller deems traffic suspicious, for example, it can reroute or drop the packets. SDN also virtualizes hardware and services that were previously carried out by dedicated hardware, resulting in the touted benefits of a reduced hardware footprint and lower operational costs.

Software-Defined Wide Area Network (SD-WAN) emerged from software-defined networking using the virtual overlay aspect. The connectivity links of a given organization’s WAN are abstracted to form a virtual network, and the SDN controller uses whichever connection it deems fit to send and receive traffic. Let us see, diagrammatically, the comparison between traditional WAN and SD-WAN.
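The control-plane/data-plane split described above can be sketched in a few lines of Python. This is a toy illustration, not a real controller API: the controller holds the packet-handling policy centrally, and a switch queries it on a flow-table miss, caching the resulting rule locally (the addresses and port numbers are invented for the example).

```python
class Controller:
    """Centralized control plane: decides where packets should go."""
    def __init__(self):
        # Policy: destination host -> output port ("northbound" apps would edit this)
        self.policy = {"10.0.0.1": 1, "10.0.0.2": 2}

    def get_rule(self, dst):
        # Rule pushed "southbound" to the switch; unknown hosts are dropped.
        return self.policy.get(dst, "drop")

class Switch:
    """Data plane device: forwards packets, asking the controller on a miss."""
    def __init__(self, controller):
        self.controller = controller
        self.flow_table = {}  # local cache of controller-installed rules

    def forward(self, dst):
        if dst not in self.flow_table:  # table miss: query the controller
            self.flow_table[dst] = self.controller.get_rule(dst)
        return self.flow_table[dst]

ctrl = Controller()
sw = Switch(ctrl)
sw.forward("10.0.0.1")  # first packet: rule fetched from the controller
sw.forward("10.0.0.1")  # later packets on the same flow: served from the flow table
sw.forward("10.0.0.9")  # unknown destination: controller says "drop"
```

Changing `ctrl.policy` reprograms every switch that consults this controller, which is exactly the centralized, programmable behaviour the section describes.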
Software-Defined Networking Use Cases
As discussed, Software-Defined Networking provides immense benefits as part of the migration to virtual environments. SDN use cases in service provider environments with cloud computing architecture are very effective.

The Business Benefits of Software-Defined Network Solutions
The dynamically changing needs of the business require a programmable, preferably centralized, network. SDN aptly caters to these business needs by dynamically provisioning services in the network. It also provides the following technical and business benefits:

• Directly Programmable: Since the control layer is decoupled from the infrastructure layer, it is directly programmable.
• Centralized Management: Controllers maintain a global view of the network and thus maintain central intelligence.
• Reduced OpEx/CapEx
• Delivers Agility and Flexibility
• Enables Innovation
Software-Defined Networking will soon transform legacy data centres into virtualized environments comprising networking, compute and storage. SDN adds flexibility in terms of controlling the network.

Bandwidth calendaring and WAN optimization are important needs of service providers which are met by SDN. SDN also offers bandwidth-on-demand, so carriers can exercise control over links and opt for additional bandwidth on an ad-hoc basis. SDN adds value to cloud computing data centres through network virtualization; in a segregated network with multiple tenants, this is very important for achieving faster turnaround times and efficient utilization of resources in the cloud. SDN policies offer network access control and monitoring to enterprise campuses.

Conclusion
Together, SDN and NFV represent a path toward more generic network hardware and more open software. SDN with NFV is the future of networking and is increasingly becoming the nucleus of the modern data centre! At GAVS, we are tracking SDN developments and adoption by various vendors, and we are excited about the potential possibilities with SDN.
About the Authors:

Chandrasekar Balasubramanian: Chandrasekar has 23 years of experience, specializing in networking. He currently heads the Networking Center of Excellence at GAVS, with solid experience in network management, network security and networking in general. He is passionate about next-generation networking technologies and is currently experimenting with them. He holds a couple of approved patents in switching and network security.

Suresh Ramanujam: Suresh is a networking architect and part of Location Zero. He has been associated with multiple global network/telecom service providers’ network transformation projects, improving network efficiency and quality of service while optimizing infrastructure (CapEx and OpEx) by adopting breakthrough models. He is passionate about evolving networking technologies and the journey towards software-defined everything.
The shifting pH of Databases from ACID to BASE

Bargunan Somasundaram

Today it is said that data is the new oil; I would also add that data is the new gold. Industry 4.0 is focused on data, and data is now considered one of the most important commodities. ‘Big data’ has become an inevitable reality today, but bigger isn’t always better: big insights are more important than big data.

Now, to extract value from gold or oil, it needs to be processed: fashioned into jewellery, minted into coins, or refined to produce different petroleum products. Similarly, data must be processed and held in a vault (a database or datastore). Big insights are possible only with the right database for daily operations. An explosion of consumer data has enabled IT companies and giants to shift the pH of their databases from ACID to BASE. Let’s see how.

In the early years of computers, punch cards were used for input, output and data storage. Punch cards offered a fast way to enter data and to retrieve it. After punch cards, databases came along. Database Management Systems allowed us to organize, store and retrieve data from a computer; a DBMS is a way of communicating with a computer’s “stored memory”. Airlines were among the first industries to identify the need for relational databases: the SABRE system was built by IBM to help American Airlines manage its data. Datastores have since evolved from the primitive approach of CODASYL to SQL (ACID) to NoSQL (BASE).

Transactions
The idea of transactions, their semantics and guarantees evolved with data management. As computers became more powerful, they were tasked with managing more data. Eventually, multiple users would share data on a machine, which led to problems of data being changed or overwritten while other users were in the middle of a calculation. This was an issue that needed addressing, so the academics were called in, and they came up with the ACID properties for transactions to solve the consistency issues. In the context of databases, a sequence of database read/write operations that satisfies the ACID properties (and can be perceived as a single logical operation on the data) is called a transaction.
To understand the importance of transactions, consider this analogy: transferring money from one account to another. This operation includes two steps:

1. Deduct the amount from the sender’s bank account
2. Add the amount to the receiver’s bank account

Now think of a situation where the amount is deducted from the sender’s account but is not added to the receiver’s account due to some error. Such issues are handled by transaction management, where both steps are performed as a single unit; in the case of a failure, the transaction is rolled back. Below are the basic tenets of the ACID model, a set of guidelines for ensuring the accuracy of database transactions:

1. Atomicity
2. Consistency
3. Isolation
4. Durability
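The all-or-nothing behaviour of the money-transfer analogy can be sketched with Python’s built-in sqlite3 module (a minimal illustration; the table and account names are invented for the example). The connection’s context manager commits the transaction on success and rolls it back on any error, so a failed transfer leaves both balances untouched.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("sender", 1000), ("receiver", 0)])
conn.commit()

def transfer(conn, src, dst, amount):
    try:
        with conn:  # one transaction: commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, src))
            # Consistency check: no account may go negative.
            (bal,) = conn.execute("SELECT balance FROM accounts WHERE name = ?",
                                  (src,)).fetchone()
            if bal < 0:
                raise ValueError("insufficient funds")
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, dst))
        return True
    except ValueError:
        return False

transfer(conn, "sender", "receiver", 100)   # succeeds: balances become 900 / 100
transfer(conn, "sender", "receiver", 5000)  # fails: the deduction is rolled back
```

The second call deducts 5000, detects the negative balance, raises, and the rollback undoes the partial update, which is exactly the atomicity guarantee described above.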
Atomicity
It is the guarantee that a series of operations either succeed or fail together, because all components of a transaction are treated as a single action. If one part of a transaction fails, the database’s state remains unchanged; there are no partial updates. For example, a business transaction might involve confirming a shipping address, charging the customer and creating an order. If one of these steps fails, all should fail.

Consistency
Consistency is the second stage of the ACID model. A transaction either creates a new and valid state of data or, if any failure occurs, returns all data to its state before the transaction was started. For example, a column in a database may only allow the values for Days to be “Monday” to “Sunday”. If a user were to introduce a new day, the consistency rules for the database would not allow it.

Isolation
Transactions require concurrency control mechanisms, and they guarantee correctness even when being interleaved. Isolation brings us the benefit of hiding uncommitted state changes from the outside world, as failing transactions shouldn’t ever corrupt the state of the system. Isolation is achieved through concurrency control using pessimistic or optimistic locking mechanisms.

Here is an example: if Bob issues a transaction against a database while Harry issues a different transaction, both transactions should operate on the database in isolation. The database should either perform Bob’s entire transaction before executing Harry’s, or vice versa. This prevents Bob’s transaction from reading intermediate data produced as a side effect of part of Harry’s transaction that will not eventually be committed to the database. With isolation, the two transactions serialize correctly:

Bob’s Transaction:
1. Read Bob’s balance ($1000)
2. Deduct $100 for a movie
3. Update the account with $900

Harry’s Transaction:
1. Read Bob’s balance, which is $900, not $1000
2. Add $600 to Bob’s account
3. Update Bob’s account (total = $1500, not $1600)

It is important to note that the isolation property does not ensure that a specific transaction will execute first, only that the transactions will not interfere with each other.

Durability
After the successful completion of a transaction in the system, the data remains in the correct state, even in the case of a failure and system restart.

The Need for BASE Models
Let’s go through the lifecycle of an application to understand the need for the BASE model. Suppose an e-commerce application is developed. At the initial soft launch, the database is moved from a local workstation to a shared, remotely hosted MySQL instance with a well-defined schema. As soon as the application becomes popular, a problem arises: there are just too many reads hitting the database. This is quite usual with any application. The first attempt would be to cache frequently executed queries. Generally, memcached or third-party cache providers like EHCache or OSCache are employed for caching. But note that the reads are no longer in compliance with the ACID model. The data is inconsistent because it is in more than one place. This
also means that the cache serves older/stale data until the DB updates the cache. As the application’s popularity grows, new features like faceted search, on-page checkout, customer reviews, live chat, etc. are introduced. If each feature had its own table, hundreds of joins would be required to prepare such a page, increasing query complexity. To avoid too many joins, denormalization must be done. If the application’s popularity surges further, it will swamp the server and slow things down, so server-side computations such as stored procedures must be moved to the client side. Even after this, some queries would still be slow, so the most complex queries are periodically pre-materialized, and joins are avoided in most cases. Now the reads might be okay, but writes are getting slower, so the secondary indexes and triggers are dropped. At this point the DB is left with:

• No ACID properties, due to caching
• No normalized schema, due to denormalization
• No stored procedures, triggers or secondary indexes
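The stale-read problem that caching introduces can be seen in a toy sketch (plain Python dictionaries standing in for the database and a memcached-style cache; the key name is invented for the example):

```python
# The "database" and the "cache" are separate copies of the data, so a
# write that touches only the database leaves the cache serving an old value.
database = {"product_price": 100}
cache = {}

def read_price():
    # Read-through cache: serve from the cache if present, else fill from the DB.
    if "product_price" not in cache:
        cache["product_price"] = database["product_price"]
    return cache["product_price"]

read_price()                      # warms the cache with 100
database["product_price"] = 120   # price updated in the DB only
stale = read_price()              # still returns 100: the cache is stale
```

Until the cache entry expires or is invalidated, readers see the old price, which is precisely why cached reads are no longer ACID-compliant.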
The ACID model is an overkill or would hinder the operation of the database. These issues gave birth to a softer model called BASE, which is extensively used by the NoSQL datastores. Basic tenets of BASE model
•
•
Eventual Consistency (Weak consistency)
When multiple copies of the data reside on separate servers, an update may not be immediately made to all copies simultaneously. So, the data is inconsistent for a period of time, but the database replication mechanism will eventually update all the copies of the data to be consistent. Conclusion Suitability of the ACID or BASE model varies case-bycase and depends on the read and write patterns. Transactions are omnipresent in today’s enterprise systems, providing data integrity even in highly concurrent environments. So, choose ACID when there is a need for strong consistency in transactions and the schema is fixed. In the age of IoT, AI/ML, High-Performance Computing is inevitable, and the computing requirements are astronomical. Eventual consistency gives the IT giants edge over others in the industry by enabling their applications to interact with customers across the globe, continuously, with the necessary availability and partition tolerance. All this, while keeping their costs down, systems up, and their customers happy. So, go for the BASE model datastores when there’s a high priority for availability and scalability and the schema is evolving. At the same time, BASE datastores don’t offer guaranteed consistency of replicated data at write time but in the future. BASE consistency model is primarily used by aggregate stores, including column family, key-value and document stores. Hbase, SOLR, cassandra, Elastic search are based on BASE models. Every relational database such as MySQL, postgresql, oracle and Microsoft SQL, support ACID properties of transactions.
Basic Availability
The datastore does guarantee availability, in the presence of multiple failures. Thus, the database appears to work most of the time because of replication. •
Soft State.
Soft State indicates that the state of the system may change over time, even without input. This is because of the eventual consistency model. In a way, datastores don’t have to be write-consistent or mutually consistent all the time.
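The eventual-consistency behaviour described above is easy to sketch. The following toy Python model is not any particular datastore's replication protocol, and all class and method names are invented for illustration; it simply shows a write being acknowledged by one replica and propagated to the others only later, so a read from a lagging replica returns stale data until a synchronization pass runs:

```python
class Replica:
    """One copy of the data on one server."""
    def __init__(self):
        self.data = {}

    def read(self, key):
        return self.data.get(key)  # None if this copy hasn't seen the key yet


class EventuallyConsistentStore:
    """Toy store: writes hit one replica immediately and are
    replicated to the others later, so reads from other replicas
    can return stale data in the meantime."""
    def __init__(self, n_replicas=3):
        self.replicas = [Replica() for _ in range(n_replicas)]
        self.pending = []  # replication log not yet applied everywhere

    def write(self, key, value):
        self.replicas[0].data[key] = value  # acknowledged immediately
        self.pending.append((key, value))   # to be replicated later

    def read(self, key, replica=0):
        return self.replicas[replica].read(key)

    def sync(self):
        """Apply the replication log to every replica (anti-entropy pass)."""
        for key, value in self.pending:
            for r in self.replicas:
                r.data[key] = value
        self.pending.clear()


store = EventuallyConsistentStore()
store.write("user:42", "alice")
stale = store.read("user:42", replica=2)  # None: update not yet propagated
store.sync()                              # replication catches up
fresh = store.read("user:42", replica=2)  # "alice": replicas have converged
```

Real datastores replace the explicit `sync()` call with background mechanisms such as read repair and anti-entropy, but the observable behaviour is the same: reads converge once replication catches up.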
About the Author:
Bargunan is a Big Data Engineer and a programming enthusiast. He is passionate about sharing his knowledge and writing about his experiences. He believes, "Gaining knowledge is the first step to wisdom and sharing it is the first step to humanity."
Deepfakes - Another reason for you not to believe everything you see on the internet!

Soundarya Kubendran

An episode of the latest season of the speculative fiction series, Black Mirror, explored the mounting risks of advanced technology in the entertainment industry. The episode depicted a pop star being replaced by her digital avatar. However, I wouldn't call that speculative fiction anymore, with recent developments in technology having demonstrated the possibility of such scenarios in the near future.

The technology I was referring to is Deepfake, a portmanteau of the terms 'deep learning' and 'fake'. Deepfake has been splashed across the news since 2017, when an explicit video with faces of celebrities doctored onto other actors was posted online. This sparked a conversation on the internet about the dangers of Deepfake: it can be used to manipulate facts in politics, propagate fake news and harass individuals. Is this technology as dangerous as it is perceived, or does it have limitations like every other technology? To get to the bottom of this, we need to understand how it works and what sort of algorithms are used.

Deepfake is a technology that uses deep learning to fabricate entirely new scenes or alter existing videos. Although face-swapping has been prevalent in movies, it required skilled editors and CGI experts. For example, after the death of actor Paul Walker in 2013, the rest of his scenes were created with the help of his brothers and the VFX team. Deepfake, on the other hand, uses machine learning systems to make the videos appear genuine, and the results are usually difficult for the layman to identify. Deepfakes can be created or edited by anybody, even without editing skills.

GAN Model

Generative Adversarial Networks (GANs) are used in creating deepfake videos. GANs are a class of machine learning systems used for unsupervised learning, developed and introduced by Ian J. Goodfellow and his colleagues in 2014. GANs are made up of two competing neural network models, a generator and a discriminator, which can analyze, capture and copy the variations within a dataset. The generator creates fake videos and the discriminator detects whether the generated videos are fake. The generator keeps creating fake content until the discriminator is no longer able to detect that it is fake. If the dataset provided to the model is large enough, the generator can create very realistic fake content. FakeApp is one such application that can be easily downloaded by users to create deepfakes. The website
https://thispersondoesnotexist.com/ generates a new, realistic facial image of a non-existent person from scratch every time the page is refreshed.

The potential of this technology is concerning. As Peter Singer, cybersecurity and defense-focused strategist and senior fellow at New America, said, "The technology can be used to make people believe something is real when it is not." It can be misused by political parties to manipulate the public and feed them misinformation. It can also become a weapon for online bullying and harassment through the release of doctored videos. To raise awareness about the risks of misinformation, a video was released with Barack Obama's face morphed onto filmmaker Jordan Peele. Fake videos of Mark Zuckerberg and Nancy Pelosi have also been doing the rounds on the internet.

Deepfake technology is already on the US government's radar. California has recently banned the use of deepfakes in politics to stop them from influencing the upcoming election. The SAG-AFTRA (Screen Actors Guild-American Federation of Television and Radio Artists), which has been at the forefront of battling these technologies, commended the governor for signing the bill. The Pentagon, through the Defense Advanced Research Projects Agency (DARPA), is working with several of the country's biggest research institutions to combat deepfakes. DARPA's MediFor (Media Forensics) program awarded the California-based non-profit research group SRI International three contracts for researching new ways to automatically detect manipulated videos and deepfakes. Researchers at the University at Albany also received funding from DARPA to study deepfakes. This team found that analysing the blinks in videos could be one way to tell a deepfake from an unaltered video, because there are not many photographs of celebrities blinking. Some researchers have suggested watermarking deepfakes to avoid misleading people, but as we know, watermarks can be easily removed.
Interestingly, Blockchain could be a part of the solution. Registries of authentic data are being created and stored on Blockchain. Photos and videos can be verified against these registries of data. This is particularly useful for
journalists or activists to ensure the credibility of what they are sharing.

There are a few limitations to the technology behind deepfakes at present. Firstly, GANs require a large dataset to train the model to generate photo-realistic videos; that's probably why politicians and celebrities are targeted more often. Also, running this model requires heavy computing power, which can be expensive. At about $0.50 a GPU hour, it costs around $36 to build a model just for swapping person A to B and vice versa, and that doesn't include all the bandwidth needed to get training data, or the CPU and I/O to pre-process it. FakeApp uses TensorFlow, a machine learning framework which supports GPU-accelerated computation using NVIDIA graphics cards. Even though the application allows users to train models without a GPU, the process might take weeks instead of hours.

As far as positive applications of this technology go, it can help filmmakers save a lot of money by morphing the faces of popular actors onto the bodies of lesser-known ones. Another interesting application, which could result in a unique viewing experience, would be to give viewers a selection of actors to choose from to digitally place in a movie. However, this could put the jobs of actors at risk. As of today, the negatives of this technology far outweigh the positives. If strict regulations are not put in place, it may well turn our lives into a Black Mirror episode.
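To make the generator-versus-discriminator loop described earlier concrete, here is a deliberately tiny sketch in Python/NumPy. It is nothing like a production deepfake model (those use deep convolutional networks and enormous datasets): the "generator" here is a two-parameter affine map trying to mimic a one-dimensional Gaussian, and the "discriminator" is a logistic regression on a single number. All names and values are made up for illustration, but the adversarial training structure, alternating discriminator and generator updates, is the same idea:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data: samples from a Gaussian centred at 4.
def real_batch(n):
    return rng.normal(4.0, 0.5, n)

gen = {"a": 0.1, "b": 0.0}   # generator: noise z -> a*z + b
dis = {"w": 0.1, "c": 0.0}   # discriminator: logistic regression on x
lr = 0.05

def generate(n):
    z = rng.normal(0.0, 1.0, n)
    return gen["a"] * z + gen["b"], z

def discriminate(x):
    return sigmoid(dis["w"] * x + dis["c"])

for step in range(2000):
    # Discriminator step: push D(real) toward 1 and D(fake) toward 0
    xr, xf = real_batch(32), generate(32)[0]
    dr, df = discriminate(xr), discriminate(xf)
    dw = np.mean((dr - 1) * xr) + np.mean(df * xf)  # grad of cross-entropy
    dc = np.mean(dr - 1) + np.mean(df)
    dis["w"] -= lr * dw
    dis["c"] -= lr * dc

    # Generator step: push D(fake) toward 1, i.e. fool the discriminator
    xf, z = generate(32)
    df = discriminate(xf)
    da = np.mean((df - 1) * dis["w"] * z)  # chain rule through x = a*z + b
    db = np.mean((df - 1) * dis["w"])
    gen["a"] -= lr * da
    gen["b"] -= lr * db

samples, _ = generate(1000)
# After training, the generated samples should sit near the real mean of 4.
```

The same tug-of-war, scaled up to images and deep networks, is what lets a deepfake generator keep improving until its output fools the discriminator.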
References:
https://www.csoonline.com/article/3293002/deepfake-videos-how-and-why-they-work.html
https://www.kdnuggets.com/2018/03/exploring-deepfakes.html
https://medium.com/twentybn/deepfake-the-good-the-bad-and-the-ugly-8b261ecf0f52
https://www.theverge.com/2019/10/7/20902884/california-deepfake-political-ban-election-2020
https://www.theverge.com/tldr/2018/4/17/17247334/ai-fake-news-video-barack-obama-jordan-peele-buzzfeed
https://www.news18.com/news/buzz/zao-a-new-chinese-ai-app-lets-you-swap-your-face-with-any-celebrity-in-8-seconds-2295115.html
https://www.geeksforgeeks.org/generative-adversarial-network-gan/
https://thispersondoesnotexist.com/
https://www.theverge.com/tldr/2019/2/15/18226005/ai-generated-fake-people-portraits-thispersondoesnotexist-stylegan
https://www.wired.com/story/wired-cartoons-week-7/

About the Author:
Soundarya is a developer at GAVS and loves exploring new technologies. Apart from work, she loves her music, memes and board games.
A Deep Dive into Deep Learning!

Padmapriya Sridhar
Kathakali Basu

The Nobel Prize-winning French author André Gide said, "Man cannot discover new oceans unless he has the courage to lose sight of the shore". This rings true for enterprises that made bold investments in cutting-edge AI and are now starting to reap rich benefits. Artificial Intelligence is shattering all perceived boundaries of a machine's cognitive abilities. Deep Learning, at the very core of Artificial Intelligence, is pushing the envelope still further into uncharted territory. According to Gartner, "Deep Learning is here to stay and expands ML by allowing intermediate representations of the data".

What is Deep Learning?

Deep Learning is a subset of Machine Learning that is based on Artificial Neural Networks (ANN). It is an attempt to mimic the phenomenal learning mechanisms of the human brain and train AI models to perform cognitive tasks like speech recognition, image classification, face recognition, natural language processing (NLP) and the like.

The tens of billions of neurons and their connections to each other form the brain's neural network. Although Artificial Neural Networks have been around for quite a few decades now, they are only now gaining momentum, due to the declining price of storage and the exponential growth of processing power. This winning combination of low-cost storage and high computational prowess is bringing Deep Learning back from the woods.

Improved machine learning algorithms and the availability of staggering amounts of diverse unstructured data, such as streaming and textual data, are boosting the performance of Deep Learning systems. The performance of an ANN depends heavily on how much data it is trained with, and it continuously adapts and evolves its learning over time as it is exposed to more and more datasets. Simply put, an ANN consists of an input layer, hidden computational layers, and an output layer. If there is more than one hidden layer between the input and output layers, it is called a Deep Network.

The Neural Network

The Neuron is central to the human neural network. Neurons have dendrites, which are the receivers of information, and an axon, which is the transmitter. The axon is connected to the dendrites of other neurons, and the junctions through which these signals pass are called synapses.

While a neuron by itself cannot accomplish much, it creates magic when it forms connections with other neurons in an interconnected neural network. In artificial neural networks, the neuron is represented by a node or a unit. There are several interconnected layers of such units, categorized as input, output and hidden, as seen in the figure.
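Such a layered network can be sketched in a few lines of Python/NumPy. The layer sizes, weights and function names below are arbitrary, purely for illustration: each unit computes an activation over a weighted sum of its inputs, and stacking more layers between input and output is what makes the network "deep".

```python
import numpy as np

def relu(x):
    """The 'Rectifier' activation: passes positive values, zeroes negatives."""
    return np.maximum(0.0, x)

def sigmoid(x):
    """The 'Sigmoid' activation: squashes any value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, layers, activations):
    """Pass an input vector through the layers: at each layer, take the
    weighted sum of inputs plus a bias, then apply the activation."""
    a = x
    for (W, b), act in zip(layers, activations):
        a = act(W @ a + b)
    return a

rng = np.random.default_rng(0)
x = rng.normal(size=3)                       # 3 standardized input features
layers = [
    (rng.normal(size=(4, 3)), np.zeros(4)),  # input layer -> hidden (4 units)
    (rng.normal(size=(2, 4)), np.zeros(2)),  # hidden -> output (2 units)
]
out = forward(x, layers, [relu, sigmoid])    # two output values in (0, 1)
```

Each layer is just a weight matrix, a bias vector, and an activation function; training consists of adjusting those weights so the outputs match the desired outcomes.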
The input layer receives the input values and passes them on to the first hidden layer in the ANN, similar to how our senses receive inputs from the environment around us and send signals to the brain. Let's look at what happens in one node when it receives input values from the different nodes of the input layer. The values are standardized or normalized (so that they are all within a certain range) and then weighted. Weights are crucial to a neural network, since a value's weight is indicative of its impact on the outcome. An activation function is then applied to the weighted sum of values, to help determine whether this transformed value needs to be passed on within the network. Some commonly used activation functions are the Threshold, Sigmoid and Rectifier functions.

This gives a very high-level idea of the generic structure and functioning of an ANN. The actual implementation would use one of several different architectures of neural networks that define how the layers are connected, and what functions and algorithms are used to transform the input data. To give a couple of examples, a Convolutional network uses nonlinear activation functions and is highly efficient at processing nonlinear data like speech, images and video, while a Recurrent network has information flowing around recursively and is much more complicated and difficult to train, but that much more powerful. Recurrent networks are closer in representation to the human neural network and are best suited for applications like sequence generation and predicting stock prices.

Deep Learning at work

Deep Learning has been adopted by almost all industry verticals, at least at some level. To give some interesting examples, the automobile industry employs it in self-driving vehicles and driver-assistance services, the entertainment industry applies it to the auto-addition of audio to silent movies, and social media uses deep learning for the curation of content feeds in users' timelines.

Alexa, Cortana, Google Assistant and Siri have now invaded our homes to provide virtual assistance! Deep Learning has several applications in the field of Computer Vision, which is an umbrella term for what the computer "sees", that is, interpreting digital visual content like images, photos or videos. This includes helping the computer learn and perform tasks like Image Classification, Object Detection and Image Reconstruction, to name a few. Image classification, or image recognition when localized, can be used in Healthcare, for instance, to locate cancerous regions in an x-ray and highlight them.
Deep Learning applied to Face Recognition has changed the face of research in this area. Several computational layers are used for feature extraction, with the complexity and abstraction of the learnt features increasing with each layer, making it quite robust for applications like public surveillance or security in buildings. But there are still many challenges, like the identification of facial features across styles, ages, poses and the effects of surgery, that need to be tackled before FR can be reliably used in areas like watch-list surveillance and forensic tasks, which demand high levels of accuracy and low false-alarm rates. Similarly, there are several applications of deep learning in Natural Language Processing. Text classification can be used for spam filtering, speech recognition can be used to transcribe a speech or create captions for a movie, and machine translation can be used to translate speech and text from one language to another.

Closing Thoughts

As is evident, the possibilities are endless and the road ahead for Deep Learning is exciting! But despite the tremendous progress in Deep Learning, we are still very far from human-level AI. AI models can only perform local generalizations and adapt to new situations that are similar to past data, whereas human cognition is capable of quickly acclimatizing to radically novel circumstances. Nevertheless, this arduous R&D journey has nurtured a new-found respect for nature's engineering miracle: the infinitely complex human brain!

About the Authors:

Padmapriya Sridhar: Priya is part of the Marketing team at GAVS. She is passionate about Technology, Indian Classical Arts, Travel and Yoga. She aspires to become a Yoga Instructor some day!

Kathakali Basu: Kathakali is part of the content team at Infoholic Research. She is an avid reader and a classical dancer who likes to travel and aspires to become an animal enthusiast in the future.
Do You Have a Strategy for Leveraging Your Key Customers?

Betsy Westhafer

“It only takes 10% of a population holding an unshakeable belief to convince the rest of the population to adopt the same belief.” ~SNARC

Have you ever thought about what would happen if the top 10% of your happy customers went out and told your other customers and prospects how great your company is? Have you ever considered the impact of having your greatest advocates out in force to help tell your story?

Customer-driven growth is not just a good idea. It’s imperative in a highly dynamic and competitive market. There is no dearth of content written about “Advocacy Marketing,” and we’re talking about much more than getting testimonials. What I am suggesting is that companies develop complete strategies around the concept of Customer Leverage.

lev·er·age /ˈlev(ə)rij, ˈlēv(ə)rij/
the power to influence a person or situation to achieve a particular outcome.

By creating a systematic and holistic approach, companies can leverage the power of their customer relationships for accelerated growth.
Take for instance, a Customer Advisory Board. In this setting, customers sit face-to-face with the executive leaders from their vendor or partner, providing insights beneficial to the host company while having the opportunity to influence the direction of a key supplier. In addition, board members get to network and share best practices with their peers while the executives from the host company build deeper trusting relationships with those in attendance. Win-win all around!
And here’s where the magic happens. Because of the nature of the advisory board, members feel compelled to help guide the host company to success, which includes advocating on their behalf. Because the relationships have been built in a confidential, transparent setting, trust is high, and so members are now more open to participating in various modes of advocacy efforts. This may include co-writing a white paper, participating in a case study, sharing the stage for a panel discussion, co-hosting a webinar, or any assortment of activities that provide mutual value.
While many companies do this on an ad-hoc basis, the real winners are the organizations that have a customer leverage strategy encompassing all aspects of customer-driven growth.

Consider this framework for a strategic approach to customer leverage:

» Audit
Establishes a baseline from which to identify and enhance areas in which you can effectively leverage key customers; provides your Customer Leverage Score™ (CLS).
» Advisory Board
Creates an ongoing system for leveraging customers in a strategic advisory capacity and deepens and strengthens trusting customer relationships.
» Advocacy Programs
Leverages the strength of key relationships to develop opportunities for customers to advocate on your behalf while, at the same time, creating value for themselves.
» Analytics
Indicates the success of the Customer Leverage Strategy by monitoring Key Performance Indicators and Return on Investment (ROI).

When approached in this manner, companies are more likely to find success in their efforts. Leveraging your key customers is undoubtedly the fastest way to win in your market and creates a significant and unique competitive advantage. There truly is no downside to utilizing your key relationships for mutual benefit.

What Is a Customer Leverage Score™?

Much like the NPS (Net Promoter Score) measures the loyalty of a company’s customer relationships, the CLS™ measures the organization’s ability to leverage those loyal relationships.

During a CLS audit, various members of a leadership team are asked a series of questions about their current ability to leverage their customers. These come in the form of yes/no questions, along with a measure of consistency and effectiveness. An index score is calculated, and then each area is measured in terms of its perceived priority for the organization. Combining the index score with the prioritization leads to the overall Customer Leverage Score. Both quantitative and qualitative data are considered.

Categories for the audit discussions include, but are not limited to, Advocacy, Strategy, Innovation, Networking, and Internal Alignment. Each category is averaged to identify areas of strength and weakness.

Here’s an example:

Category: Strategy
Question: Do we let our customers know how valuable they are to us by giving them an opportunity to influence our decision making and strategic direction?
Possible answer: Yes, we have a Customer Advisory Board where our customers can influence our decision making and strategic direction. We are not consistent with having our board meetings, however, and when we do have them, they are only moderately successful. This is a high priority for us, but we just can’t seem to get the internal resources aligned to make them happen consistently and effectively.

The CLS asks various questions in the format you see above and serves as a baseline to measure the effectiveness of a customer leverage program. It’s recommended that the CLS audit be repeated on an annual basis to monitor progress.

Other Key Metrics

As with any great strategy, it’s important to have key performance indicators to ensure the strategy is securing the intended outcomes. Other key metrics for a customer leverage program may include:

• Advocacy efforts leading to new business
• Retention of Customer Advisory Board members
• Account expansion among advisors and advocates
• Others

Sources:
https://visionedgemarketing.com/four-key-considerations-for-creating-advisory-boards/
https://technologyadvice.com/blog/marketing/make-customer-advocacy-fun-easy-rewarding/

About the Author:
Betsy Westhafer is the CEO of The Congruity Group, a US-based consultancy focused on customer leverage programs. She is also the author of the #1 Best Seller, “ProphetAbility – The Revealing Story of Why Companies Succeed, Fail, or Bounce Back,” available on Amazon.
Demystifying Customer Centricity - A simplistic take

Vidyarth Venkateswaran

I am among the many who are intrigued by how some organizations can seamlessly surpass expectations and deliver a great customer experience. I find it especially revealing when the concept is examined in the context of organizations with an array of seemingly every-day products or services. Let us consider the case of a simple Indian restaurant that I frequented with my wife when we spent a week in Bali.

In case you are wondering why we went searching for Indian food in another country (ask again), it was the only one open close to our Airbnb when we landed up there at 10 pm. We were driving from the airport, hungry, and really needed a decent meal. With clear guidance from the missus to avoid fine dining, given the time such places take to bring whatever we might order, we were looking for something simple, preferably a cuisine that reminded us of home. We stumbled upon this small Indian restaurant close to our Airbnb that was decently crowded. It also had a decent amount of space to park our car (which we later realized is a luxury in Bali). So, we went in and each ordered Palak Paneer (cottage cheese in spinach gravy) and some Jeera Rice (cumin rice). Outright, it was delicious! The incident that followed is really what caught my eye. After having the palak paneer and rice, we were still hungry. But, knowing another dish each would be too much, we decided to share something. We called our waiter over, a tall, dark man perhaps in his mid-forties. He was wearing a pale white shirt, a pair of shorts, and a faded purple towel hung on his shoulder.
“Anything else, sir?” he asked.
“What other North Indian dishes do you have?” my wife asked.

As he started rattling off all possible combinations of paneer, followed by some chaat (savoury Indian snacks), we stopped him at Chole Bhature. For the uninitiated, it is hand-kneaded wheat-flour dough rolled into a thin sheet and deep-fried to golden-brown perfection, served with a lip-smacking chickpea gravy with a pinch of coriander (CREDITS: a fine-dining menu); or, as a good friend of mine describes it in layman’s terms, it is a poori (an Indian flatbread) the size of an inflated airbag, served with chickpea curry and a dollop of butter on it. Once we had decided, my wife and I agreed to order one portion and split it. The waiter nodded and went into the kitchen. After a few minutes, he came back with the Chole Bhature, already cut in half. He had informed the chef that we were planning to share it and requested that the bhatura be cut in half before deep-frying it. As he was serving it, he alluded to how his grandmother always used to emphasize that the root of all arguments between couples was an unhappy stomach, which is why
he did not want to give it a chance. My wife and I were both surprised and delighted with the whole experience. The waiter understood that it would be messy to tear a full bhatura in half. So, he told the chef to cut it in half and fry it, to make our lives easy. I thought it was a great example of how to nail customer experience.

An article I read recently on Forbes spoke about a study conducted under the umbrella of the American Customer Satisfaction Index. The study focused only on customers in the USA who participated in objective evaluations of the quality of goods and services purchased in America and produced by domestic and foreign firms with substantial US market shares. The results revealed a common thread in their “claim to fame”. Here’s what I found:

• Genuinely caring about customer outcomes makes a real difference. Remembering that all business is eventually a transaction between human beings is critical. Genuinely taking an interest in the customer’s pain points, goals, and objectives, rather than focusing on the task or transaction, makes it real.
• Recognizing that customer experience is not a trade-off. While firms are constantly dealing with real-world pressures of profitability and costs, the ones that believe positive customer experience is non-negotiable make their mark. They are able to inspire loyalty and almost build a fan base that tides them through thick and thin.
• Investing in providing a positive employee experience is crucial. The famous words of Richard Branson about how happy employees make for happy customers need no references.
Case in point: in Phil Knight’s Shoe Dog, much is said about the early years of Nike. The salesmen who worked in Nike stores maintained a personal relationship with every aspiring athlete, be it someone on a university running team or a professional athlete. They knew every athlete’s requirements, their upcoming races, and so on. Some would even send postcards to athletes to ask about their races. This was one of the reasons star athletes endorsed Nike when they were at the peak of their careers. Nike cared, and the athletes reciprocated when they became famous.

We at GAVS are proud of the focus and emphasis we place on customer centricity as part of our culture at the firm. Our transition from being enablers of Delivery Excellence to being enablers of Customer Success, as we have grown from strength to strength, is a testament to what the concept means to all of us as partners at the firm. This is seen in everything we do, from how we host our customers and partners at our office to the accelerators and enablers we use as part of our Customer Success Management framework in solutioning and delivery. As a GAVSian who is relatively new to the system, I am happy to see such a refreshing approach to Customer Centricity.
About the Author:
Vidyarth is an Associate Vice President with the Customer Success team at GAVS Technologies. An ex-Accenture Strategy client engagement manager with over 8 years of Strategy and Management Consulting experience across multiple industries, he has managed projects and client accounts across the USA, Europe, Singapore, Japan and Malaysia. He has led numerous engagements enabling digital transformation, operating model definition and improvement, business process and IT architecture, and supply chain transformation for clients. In his role at GAVS, Vidyarth drives strategic interventions focused on transforming GAVS’ business practices to enable best-in-class IP and solution delivery for our clients.
“It is not the strongest of the species that survives, nor the most intelligent that survives. It is the one that is the most adaptable to change.” - Charles Darwin