T E C H FA S T LY
April | 2021
ETL OR ELT: BUILD VS BUY
IN CONVERSATION WITH ARNAB PANDEY
SENIOR ANALYST FOR DERIVATIVES AND FOREIGN EXCHANGE | QUANT AND FINTECH ENTHUSIAST | PADMA SHRI NOMINEE 2021
ARE YOU JUST "DOING AGILE" OR "BEING AGILE"?
DIGITAL ACCELERATION IN THE TIME OF CORONAVIRUS
THE FUTURE OF ROBOT AND WORK
MEET CLUBHOUSE
IS ARTIFICIAL INTELLIGENCE A THREAT TO JOURNALISTS AND WRITERS?
C O N T E N T S
P. 5 Digital Acceleration in the Time of Coronavirus
P. 11 ETL or ELT: Build vs Buy
P. 25 Cloud Computing in Health Care Industry
P. 33 Why Is ELT Better for Cloud Data Warehousing?
P. 46 The Future of Wealth Management — Tech or Hybrid?
P. 52 What Does AI Have in the Box for E-commerce?
P. 65 Interview: Arnab Pandey
P. 74 Story of Machines — The Computer in Your Hands: How These Pioneers Reshaped Our World
P. 84 Is Artificial Intelligence a Threat to Journalists and Writers?
P. 92 Technology and Stock Market — A Millennial's Take
FROM THE EDITOR

The time for your business to digitally transform is now. The process of digital transformation involves leveraging the latest tools of Artificial Intelligence, IoT devices, Cloud Migration, Big Data, Blockchain, and other high-end technologies. The COVID pandemic has accelerated the speed of digital transformation for many companies. We have explored how digital transformation has helped accelerate businesses amidst the pandemic.

Migrating your data to a cloud platform is a crucial step for digital transformation, and most companies are opting for the ELT method to handle their analytical data. We have weighed the pros and cons of both the ELT and ETL methods for cloud data warehousing. We have also analyzed whether you should 'build' your own data warehouse or 'buy' it from a third party.

We have also looked at the positive side of artificial intelligence's ethical dilemma. A few more articles around AI will cover its impact on the news industry and e-commerce businesses.

In our interview series, we had Arnab Pandey, Padmashri Award Nominee 2020, with us this time. Arnab is a board member of the D&I council for Fixed Income at Raymond James. We talk with him about the capital markets and how technology is changing the landscape. He also discussed with us the application of AI tools in the market and what the future holds.

We are introducing the 'Story of Machines' series in our attempt to bring something new for you. We plan to cover the innovations that help ease our lives. This month, we begin with 'The Computer in Your Hands – How These Pioneers Reshaped Our World'.

We would appreciate your feedback and are always open to suggestions. Here is your edition. Do read it, enjoy it, and soak in the knowledge. Happy reading!

Thank you,
Srikant Rawat
COO, Techfastly
Thomas Davis
Digital Acceleration in the Time of Coronavirus
We all knew technology would be a disruptive force in our lives. But we didn't expect it to happen this fast. Businesses have had to adjust to a work landscape that changed overnight, thanks to the coronavirus. Businesses that weren't prepared for the digital age received a wake-up call. This left most of them scrambling and trying to figure out how to adjust on the fly. Problems they never had before came out of nowhere and had to be dealt with, like virtual meetings, migrating operations, employee productivity and mental health, infrastructure issues, etc. Those companies that had the foresight to begin planning ahead of time were much better prepared. They were well-positioned to leapfrog competitors who weren't. This does not mean that they could have seen this coming, but they will be in a more advantageous position to come out the other side of this pandemic, stronger.
Companies with Foresight

For instance, RXR Realty was one of those companies that positioned themselves well ahead of time. The New York City-based commercial and residential real estate developer began investing in digital capabilities that set it apart from its competition.

"Historically, real estate has been a very transactional business. We felt that by leveraging our digital skills, we could create a unique and personalized experience for our customers similar to what they're used to in other aspects of their lives," explains Scott Rechler, CEO of RXR.
Before the global pandemic hit, RXR had established a digital lab. This included over 100 data scientists, designers, and engineers working across the business to complete digital initiatives. The company was far ahead in creating an app that could provide services without human intervention. This included enabling move scheduling, deliveries, dog walking, and rent payments on the residential side, and real-time analytics on heating, cooling, and floor space optimization for tenants on the commercial side. When social distancing and contactless interactions suddenly became necessary for the tenants of the buildings, RXR could pivot quickly and seamlessly to meet this need. RXR Realty received the 2020 Digie Award for Best COVID Tech for its preparedness.

Majid Al Futtaim Retail, a Dubai-based conglomerate that runs retail food and grocery stores in the Middle East, is another example of an organization ahead of its time. The company began building its digital presence long before COVID-19 hit. It began assembling its online presence in 2015 to operate alongside its 315 brick-and-mortar stores across 16 countries. But progress was slow, since they didn't see the need for any urgency. Then the pandemic hit, and the company was forced to speed things up. "The pandemic pushed us to accelerate our digital transformation. We are implementing things in the coming 18 months that we originally wanted to achieve in five years," said Hani Weiss, CEO of Majid Al Futtaim Retail.

This sudden increase in demand meant Majid Al Futtaim had to move fast. They quickly converted some physical stores to fulfillment centers, but the stores proved to be too small. So, logistics managers swiftly arranged to have a 54,000-square-foot fulfillment center tent erected and operational in five weeks. The centers were complete with frozen and chilled food rooms, making it easier to stock more than 8,000 items ready to ship. Weiss credits initiatives such as Click and Collect, supported by a redesigned app, with meeting the increased demand. The app made shopping easier for customers, and the company launched contactless payment options such as Mobile Scan and Go in its stores, allowing customers to scan items and pay with their smartphones. These tools allowed the company to launch an online marketplace with 420,000 new products from other retailers whose stores were closed during the lockdown. "No matter how our customers want to shop, we can be there for them," Weiss says. "We developed this agility through the pandemic, and I want to keep it as we go forward."
Where Do Companies Go From Here?

COVID-19 changed the digital landscape for companies like no technology ever has. It transformed the way we work forever, and companies have had no choice but to adjust quickly. Digital acceleration accomplished in a few months what would normally take a couple of years.
This new environment forced companies to adopt new strategies and plans for the future. This would be no easy task, especially for those that weren't prepared. Organizations have to change immediately or risk being left behind and having their business disappear or become irrelevant. They need to make better-informed and faster decisions, focusing on automation, real-time risk assessment and mitigation, continuous value delivery, and agile strategy. Functional leaders should begin thinking about speed and efficiency if they haven't started already: cut out anything that drags or slows down processes, procedures, or customer interaction.

When we look at the future of work, a few factors need to be addressed. Things such as intellectual, social, and human capital have to be considered, and decisions have to be made on how best to move forward. Companies must leverage individual and group decision-making that includes different stakeholders: technology folks, analytics people, subject matter experts, and creative out-of-the-box thinkers. The use of technology will be a crucial piece in bringing people and ideas together to work as one. Various communication methods such as social media, apps, chatbots, and video conferencing will be front and center for your business to be successful. Lastly, businesses need to understand how best to communicate with vendors, external parties, and customers to survive in this new landscape.
Conclusion

The coronavirus has sped up digital acceleration like nothing any of us has seen before. Business was headed in that direction, albeit slowly. The way people work and shop has changed forever. Tons of organizations shut down brick-and-mortar stores, a number of them never to return. These went from a physically interactive environment to a more technically intensive atmosphere. Many stores couldn't pivot this way and had to deal with the harsh, grim reality. But those that took advantage seized an enormous opportunity to put them on the road to success. Digital acceleration caused a huge interruption in our lives, but we will all be better for it in the long run.
Anjali Prabhanjanan
ETL or ELT: Build vs Buy
INTRODUCTION

The significance of ETL in an organization is directly proportional to how much it relies on data warehousing. ETL tools collect, read, and migrate vast volumes of raw data from multiple data sources and across different platforms. They load that data into a single database, data warehouse, or data store for easy access. They process the data to make it meaningful with operations such as sorting, filtering, joining, merging, reformatting, and aggregation. Ultimately, they offer graphical interfaces for quicker and easier results compared to the traditional method of moving data through hand-coded data pipelines.
These tools help break down data silos, making it easier for a data scientist or engineer to access and analyze data and turn it into business intelligence (BI). They are the first essential step in the data warehousing process, helping you make more informed decisions quickly. ETL tools are crucial for a data-driven business, extracting and transforming data from several disparate sources before loading it into the main data repository.
NEED FOR ETL TOOLS:
ETL tools can help your business grow in the following ways:

1. Time efficiency
These tools allow you to collect, transform, and consolidate data in an automated way, saving plenty of the time and effort otherwise spent importing data manually.
2. Reduced error probability
You are prone to making errors when handling data manually, even if you are careful. Moreover, a slight error in the early stages of data processing can be dicey, as one error leads to another, and the cycle continues. For instance, if you enter sales data incorrectly, your entire calculations can go wrong. ETL tools automate several parts of the data process, reducing manual intervention and the probability of error.
3. Easily handle complex data
Over time, your business will have to work with a vast volume of data that is diverse and intricate. For example, you may be a multi-national organization with data coming in from three different countries with distinct product names, addresses, customer IDs, etc. If you have to manage such a range of attributes, you may end up formatting data all day long. This is where an ETL tool can streamline things for you by simplifying data cleaning and processing.
4. Improves business intelligence and return on investment (ROI)
An ETL tool helps ensure that the data used for analysis is of the finest quality, so you can use this high-quality data to make better decisions and increase your ROI. ETL software helps enterprises get significant insights that support their business development, drawing data from numerous sources into a practical arrangement. It streamlines and improves the process of blending raw data distributed across several systems into a data repository. Selecting the right ETL tool therefore plays a crucial role in your BI.
Understanding ELT & ETL
ETL (Extract, Transform, Load) has been the traditional approach to data warehousing and analytics for the past couple of decades. The ELT (Extract, Load, Transform) approach changes the old paradigm: the transform and load steps are switched. Both ETL and ELT address the same need, helping to clean and manage raw data and ready it for analysis. Businesses need to collect, process, and analyze billions of data points and events, enriching, molding, and transforming them to make them meaningful. The working methodologies of the two approaches differ, and this opens new possibilities in many modern data projects. These differences include how raw data is managed, when processing is done, and how analysis is performed.
Let us discuss the technological differences between ETL and ELT, and how data engineering and analysis play out in each approach, across the three stages (a minimal sketch in code follows the list):

1. Extraction It involves retrieving raw data from an unstructured data pool and migrating it into a temporary, staging data repository.
2. Transformation This stage includes converting, structuring, and enriching the raw data to match the target source.
3. Loading It involves loading the structured data into a data warehouse to be analyzed and used by BI tools.
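To make the contrast concrete, here is a minimal Python sketch. The function names and data are illustrative stand-ins, not any specific tool's API; the only difference between the two pipelines is where the transform step sits relative to the load.

def extract(source):
    # Pull raw rows from a source system (here, just a list of dicts).
    return list(source)

def transform(rows):
    # Clean and reshape rows to match the target schema.
    return [{"name": r["name"].strip().title(), "amount": float(r["amount"])}
            for r in rows]

def load(rows, table):
    # Write rows into a destination table (a list standing in for a warehouse).
    table.extend(rows)

source = [{"name": " alice ", "amount": "42.5"}]

# ETL: transform in a staging layer before the warehouse ever sees the data.
etl_table = []
load(transform(extract(source)), etl_table)

# ELT: land the raw rows first; transform later, inside the destination.
elt_raw = []
load(extract(source), elt_raw)
elt_clean = transform(elt_raw)  # in ELT, this step runs on the warehouse's compute

print(etl_table, elt_clean)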
ETL vs. ELT

ETL needs management of the raw data, including the extraction of the needed information and running the right transformations to eventually serve the needs of the business. All three stages (E, T, and L) need interaction from data engineers and developers, who must also deal with the capacity limitations of traditional data warehouses. Using ETL tools, data analysts and other BI users have become accustomed to waiting, because they have no simple access to the information until the whole process is complete. The transformation time for ETL increases as the data size grows. It uses a staging area and system, which requires more time for loading the data. Since ETL is based on multiple scripts for creating views, deleting a view means deleting the data. It requires high maintenance, as the data must be chosen before transformation and loading; if data is deleted, or you want to enhance the main data repository, you need to do it all again. ETL needs less space at the early stage, and the result is clean. It is used primarily by IT, has fixed tables and timelines, and involves presuming and choosing the data a priori. It follows the prevalent legacy model for on-premises, relational, structured data and does not support the data lake approach. ETL is less cost-effective for SMEs (Small and Medium-sized Enterprises) and businesses, especially as the data size increases over the years. It also needs in-depth knowledge of the tools and expert design of the main data repository.

In the ELT approach, once you've extracted the data, you immediately start the loading phase, moving all the data sources into a single, centralized data repository. With modern cloud infrastructure, systems can now support huge storage and scalable compute, so a large, expanding data pool and fast processing are available to maintain all the extracted raw data. The ELT approach is a modern alternative to ETL. The ELT process is still evolving, because the frameworks and tools to support it are not always fully developed to facilitate loading and processing large volumes of data. However, it is quite promising, as it enables unlimited access to all your data at any time and saves the effort and time of developers for BI users and analysts. ELT is an all-in-one system: the time and speed of transformation do not depend on the data size, and the data is loaded only once. As it creates ad hoc or temporary views, the cost of building and maintaining them is quite low, and ELT is low maintenance since the data is always available. It is used by everyone, including developers and citizen integrators, as it is ad hoc, agile, and flexible. ELT is built for scalable cloud infrastructure that supports structured and unstructured big data sources, enables the data lake approach with support for unstructured data, and is available to companies of all sizes that use online SaaS.
ETL FUNCTIONS, BENEFITS, AND LIMITATIONS

ETL is dated: it helped cope with the limitations of traditional, rigid data center infrastructures, which are no longer a barrier today with the cloud. For enterprises with large data sets of even a few terabytes, load time can take hours, depending on the complexity of the transformation rules. ELT is a vital part of modern data warehousing. Using this approach, business organizations of any size can capitalize on current technologies. By analyzing larger pools of data with more agility and less maintenance, companies can gain key insights to create a real competitive advantage while excelling in their business.

Now that you know both the ETL and ELT tools, which one do you need, and why? And should you build your own tool or buy it? Here we'll explore why buying tools might work best for some, while hand-coding the ETL layer may work better for others. But before diving into that, here's a quick overview of the function, benefits, and limitations of the ETL process.

The ETL process describes the infrastructure built to pull or extract the raw data from a source for analysis, transform it into some useful form for a business need, and deliver or load it to a data warehouse. This automated process is an integral part of the data pipeline of any business or organization.

A traditional way to implement ETL is to process the data in batches, once every 24 hours. Batch processing involves stopping at some point to store and collect the data while the system 'forklifts' the data over to your data warehouse. The loaded and transformed data is then available for BI tools to provide the insights and visualization layer to the enterprise. Powerful software frameworks like Hadoop MapReduce or Stitch have been designed for batch processing of huge datasets.

A more modern approach is to have the ETL pipeline perform stream processing of data, especially where real-time analytics are crucial, latency matters, and extracting and transforming large batches of data in bulk may be counterintuitive for the business needs of an organization. Stream processing tools, such as Apache Spark or Kafka, process data in-memory, greatly reducing the time it takes for the data to become actionable. Though batch processing lets the data build up, stream processing renders the data more steadily accessible by spreading the processing over time. Batch processing involves moving data in bulk, validating and cleaning it, and then staging it in a database before loading it to the target tables within the data warehouse. Stream processing, by contrast, involves moving data at speed, and should be viewed as preferable or mandatory when the insight and value possible from touching data immediately diminish as the time component increases.
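As a rough illustration of that difference, here is a plain-Python sketch (no Hadoop, Spark, or Kafka involved; the event data is invented) contrasting a forklift-style batch job with record-at-a-time stream processing:

import time

events = [{"user": i, "ts": time.time()} for i in range(5)]

# Batch: let events accumulate, then process them in one bulk pass,
# e.g., once every 24 hours.
def process_batch(batch):
    return len(batch)  # stand-in for validating, cleaning, and staging

print("batch processed:", process_batch(events))

# Stream: handle each event as it arrives, keeping latency low.
def stream(source):
    for event in source:  # in practice, this loop would read a message queue
        yield {"user": event["user"], "seen_at": event["ts"]}

for record in stream(events):
    print("streamed:", record)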
BUILDING THE TOOL

When deciding whether to build or buy, start at the beginning. This may sound counterintuitive, and it includes scenarios where an organization realizes it is way past the starting line and galloping along an inefficient trajectory concerning data.
1. Form a schematic defining the scope
Putting a plan together helps describe your project parameters and understand your business needs. The top priority is making your data make sense, and delivering clean, useful data that is ready for deriving insights. To start building this plan, begin with a few straightforward questions, such as:

• What are your top business needs to be addressed?
• What data jobs are essential?
• What business needs do you want to address, but haven't been able to?
• What data jobs would you like to incorporate, but didn't think possible or viable?
• How much money will be saved hand-coding versus buying a tool, and for how long? (A rough cost model is sketched below.)
• If you are building your own ETL, how much time will you need to correctly develop and test it?
• Will these pipelines scale as the company grows?
• If you are using hand-coding, how responsive will the in-house team be to a failure?
• Can you commit to long-term maintenance, not just a one-time investment in the setup?
• Conversely, if you are using outside tools for your ETL, how agile are those tools for your company's current and changing needs?

Each company has different requirements, yet none of them will benefit from an unnecessarily sophisticated or poorly structured pipeline, as it will underperform.
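On the cost question above, a back-of-the-envelope model can frame the discussion. The figures below are purely hypothetical placeholders; substitute your own salary, maintenance, and subscription numbers.

def build_cost(engineer_salary, build_weeks, maintenance_fraction, years):
    # Upfront build effort plus ongoing upkeep, expressed in salary terms.
    weekly = engineer_salary / 52
    return build_weeks * weekly + maintenance_fraction * engineer_salary * years

def buy_cost(annual_subscription, years):
    return annual_subscription * years

# Hypothetical: a $120k engineer, 45 person-weeks to build, 20% of a
# head per year on maintenance, versus a $30k/year tool, over 3 years.
print("build:", round(build_cost(120_000, 45, 0.20, 3)))   # ~175,846
print("buy:  ", round(buy_cost(30_000, 3)))                # 90,000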
2. Describe the project guidelines for the desired ETL system

Describing the project guidelines by documenting everything concisely gives those who rely on them, such as the DBAs and the IT team, an easy reference. Whether you decide to hand-code the ETL or buy a tool, successful installation of data integration means everything is configured effectively. Further, it means that you have completed testing with representative data sources, including the intentional introduction of corrupted data at certain iterations, to ensure that the system functions as desired. Whenever possible, the tests should be automated, a support system should be in place, and any training should be available as needed.

As each step in constructing and implementing your organization's preferred ETL method is executed, ensure that the initial questions are answered and that acceptance testing is treated as though it had real-world significance. Adopt proactive measures, such as regular audits and profiling of target tables and of the system as a whole, to avoid potential data quality issues.

With the build scenario, you know the metrics that matter and have control over your data pipeline and how you want everything configured. If you are leaning toward this solution, you should consider the following finer points:
1) Building a hand-coded system offers a level of flexibility that a tool can't provide

Using hand-coding, you can build anything you want, whereas a tool-based approach may be limited to its available features. You can tailor a hand-coded system to directly manage any metadata, and this increased flexibility might be desirable depending on your organization's needs. Before deciding to build, it is vital to ask yourself a few key questions:

• Is there a dedicated in-house data team for writing scripts and managing the workflows?
• If there is a single data scientist with no team, for whom this is not their sole focus, is the task of hand-coding ETL relegated to someone's second job?
• Can the ETL team handle maintenance tasks such as documentation, testing, and optimization as well as the setup?

Even with the coding chops available to build the pipeline, you may lack the time or bandwidth to do the job properly for the future.
2) Total control, visibility, and responsibility

Having control is awesome. Does running data jobs whenever and wherever you determine outperform the tools you could buy? Does a custom data integration solution render any present part of an ETL tool unnecessary overhead that needs trimming from productivity costs? If so, then hand-coding looks extremely viable. However, "with great power comes greater responsibility."
• How are you positioned to handle things when data fails to load, or loads only partially, with that failure potentially not even in the ETL layer but somewhere within the raw source data?
• Are the necessary checks built in so the system swiftly handles the situation? (A minimal sketch of such a check follows below.)
• Will new data sources or connection changes be integrated seamlessly?

It is significant to understand that building the data pipelines means figuring out how to manage all these issues with internal resources.
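What such a built-in check might look like: the sketch below is illustrative (the validity rule is a made-up placeholder), and it compares source and destination row counts so a partial load fails loudly instead of silently. A real pipeline would quarantine the batch and alert on-call staff.

def load_with_check(source_rows, destination):
    expected = len(source_rows)
    for row in source_rows:
        if row.get("id") is not None:   # illustrative validity rule
            destination.append(row)
    loaded = len(destination)
    if loaded < expected:
        raise RuntimeError(f"partial load: {loaded}/{expected} rows")
    return loaded

warehouse = []
rows = [{"id": 1}, {"id": None}]        # second row is intentionally corrupted
try:
    load_with_check(rows, warehouse)
except RuntimeError as err:
    print("check caught:", err)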
3) Cost-effectiveness

If there are budgetary concerns, and your current data pipeline does not need everything offered by an ETL tool, then building a solution may be advantageous. However, along with budgeting for maintenance, it is imperative that ETL testing is taken seriously. You should place checks throughout your ETL, with regular monitoring and troubleshooting, to address any data quality issues head-on and ensure optimal performance. Also, remember the investment in tech infrastructure required to build your system, including the direct costs of keeping it operational. Will the money saved now with hand-coding continue to pay off in the long run?

If your organization's data integration needs are highly unique, building your own ETL may be the best and cheapest option. However, it may not cover all your needs. It has scalability limitations: your business may grow and have a large volume of data that needs to be analyzed for BI. And your company may not have a compatible, dedicated team of engineers that can meet the demand for constant maintenance, documentation, and testing of the data integration tool.
BUYING THE TOOL

Your enterprise wants the most robust data pipeline, and when it comes to data visualization you are determined to create an environment that allows for ownership and autonomy. However, you don't have the desire, time, or expertise to manage your own pipeline. This is where a prebuilt solution could be the best fit for your company.
1) You do not have to build anything

The tool vendor has put its energy and best people into building an ETL tool that it hopes your organization can't live without. As the vendor focuses on building the ETL tool, you and your organization can focus on other business problems. Another consideration is that as teams change and people move on, the vendor from whom you purchased the tool will have everything about your ETL documented clearly, ensuring tomorrow's developers have everything they require for the future. Using ETL tools also means not wasting time over documentation, eliminating the potential and counterproductive lack of clarity that arises when new hires become responsible for a customized data integration solution they did not build.
2) Completely justified price

Even if you and your data team are comfortable writing scripts, is that the most efficient use of company time? Though a tool requires an initial outlay of funds, it makes up for itself on large or complex projects in the future, and its true potential may only be revealed after you start working with it. The speed of insight depends on the velocity toward its accessibility. Using a tool also relieves your responsibility for some aspects of data security and data integrity, with data lineage documentation auto-generated from a metadata repository.
3) ELT is scalable and data flow is managed

Using a tool that facilitates teamwork makes it easier to share and transfer knowledge throughout an enterprise. Managed automation transforms multiple data flows into an actionable visualization layer that allows end-users to query data without having to understand how any of it works. Also, when opting to buy, any professional tool should have no problem scaling up as your organization grows; handling additional data sources and increased data volumes without compromising performance can be achieved by purchasing the tool. Other advantages include reduced potential for errors while using the tool, which helps shorten the development cycle. Tools are user-friendly, with smart features such as drag-and-drop. However, they may need regular updates, and their out-of-the-box features can sometimes be restrictive.
CONCLUSION

When it comes to data integration, defining the unique business needs of an enterprise or organization helps bring focus to the build vs. buy decision. There is convenience in buying, as the tool is assembled for you by pros who know what they are doing; set against that is the control inherent in building it yourself, which includes native knowledge of how everything works, since you are the one who oversaw the construction. Eventually, you should choose an ETL process based on your end goal: getting data into a state where users can derive insights. Having this in place is a critical step for the growth of your company based on business intelligence.

Your organization can benefit from replacing ETL with ELT, as it is intended for analysts, data engineers, and data scientists whose work directly involves or depends on the data pipelines. If ELT simplifies data engineering, why should you buy a tool instead of building one from scratch to perfectly suit your needs? The reason is that building the tool from scratch could demand some 45 person-weeks, costing a whole year's salary of a full-time data engineer, plus the downtime related to maintenance. Additionally, the data engineer is unlikely to be an expert in the idiosyncrasies of each specific data source. Some APIs may be extremely complex, poorly documented, or ignore best practices. You should also consider the effect of such work on the morale of analysts and data scientists: data pipelines are considered tedious and thankless to build, and they end up among people who would rather do other things.

So, buying the tool can help solve all these problems with a standardized solution that performs its job with minimal supervision. Buying the tool helps you manage processing requirements better, save more time, and increase developer efficiency. Diyotta is an ELT tool that provides effortless data integration solutions for analytics teams. It has code-free data integration that can be used by everyone. This tool allows you to have all your data in one reliable place, in real time, at an unbeatable price and with strong customer service. It can be deployed in minutes, is easy to use with no code or engineering required, and is flexible enough to connect all your data at an affordable cost.
Cloud Computing in Health Care Industry
Ashlin Jenifa
The vast impact of digital transformation makes the future of the health care industry firm. What's interesting about cloud computing in healthcare? Cloud computing changes the traditional way of dealing with data. Big data analytics with cloud computing helps enterprises manage patient data, make the best healthcare decisions, lessen operational costs, collaborate better, and so on. Healthcare providers worldwide have understood the real power of cloud-based healthcare solutions.

The necessary applications run in the cloud to make hospital operations run seamlessly. The cloud derives structure from unstructured data to help doctors diagnose without a glitch, and allows patients to view their reports and prescriptions remotely. The data is stored and secured in the cloud, which delivers hassle-free functional, operational, and economic advantages for everyone, including patients, staff, healthcare professionals, and healthcare companies. Let's analyze the healthcare industry's transformation in areas such as patient care, data security, diagnosis, and more with cloud computing solutions.
Follow Healthcare Policies in Security

Cloud-based health care solutions are more efficient and secure when compared with physical records. The cloud's stored data can be accessed only by authorized healthcare providers, whenever needed, avoiding data leakage and unnecessary situations. Major cloud platforms such as AWS, Azure, and GCP follow healthcare standards for covered entities and their business partners under the U.S. Health Insurance Portability and Accountability Act of 1996 (HIPAA). They use the protected cloud provider environment to access, process, maintain, and store health reports. The process ensures that patient data is protected and not used without necessity.
The Revolution of the Healthcare Industry with Cloud Computing

A healthcare firm acquires two simultaneous benefits when shifting to the cloud: both patients and healthcare workers gain. It is proven that cloud computing has reduced operational costs and lets healthcare professionals deliver top-notch personalized care. At the same time, the cloud elevates patient engagement with their health strategies: it allows patients to access their information, ensuring enhanced patient outcomes. Remote accessibility of healthcare information liberates providers and patients in many ways and eliminates location barriers limiting access to healthcare. A few of the benefits of cloud computing in the healthcare industry are as follows:
01
Enhanced Scalability
Apart from being compelling and authentic, cloud-based healthcare solutions are also robust and scalable. Cloud solutions give you the flexibility to scale infrastructure up or down as needed to fit in new data. Moving to the cloud might be the right solution for unexpected situations; the best example is the current COVID-19 pandemic. A large number of hospitals use AWS load balancing and auto-scaling services for their trusted security and efficiency.
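For a sense of what that auto-scaling looks like in practice, here is a hedged sketch using boto3, AWS's Python SDK. The Auto Scaling group name is a made-up placeholder, and a real deployment would also need the group, credentials, and region configured beforehand.

import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")

# Target-tracking policy: keep average CPU near 50%, adding or removing
# instances as patient-portal traffic rises and falls.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="clinic-portal-asg",   # hypothetical group name
    PolicyName="keep-cpu-near-50",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,
    },
)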
02
Build Collaboration
Regardless of the source or storage, collaboration aims at setting up data integrations all over the healthcare system. Subsequently, patient data is available anytime for sharing, and for gaining the knowledge to prepare healthcare planning and delivery through the interoperability elevated by cloud adoption. Cloud computing enables healthcare providers to access patient data merged from multiple resources, distribute it to the primary stakeholders, and provide prescriptions and treatment guidelines on time. It also reduces the distance between professionals, enabling them to review cases and provide their suggestions irrespective of geographical restrictions. Various segments of the healthcare industry, such as pharmaceuticals, insurance, and payments, come together with the cloud's collaboration mechanism, which allows patient data to be accessible from anywhere, anytime.
03
Improved Accessibility
The cloud’s stored data delivers remote accessibility, which is considered one of the most important benefits. Cloud computing and healthcare is a good pair that has the power to increase and enhance many healthcare-related operations such as posthospitalization care plans, telemedicine, and virtual medication attachment. It also improves access to healthcare services. Telemedicine apps add the attributes of convenience to healthcare delivery and upgrade the patients with a seamless experience. Cloud-based telehealth applications and processes enable easy sharing of healthcare information, enhance accessibility, and offer healthcare coverage to the patients during the recovery phase.
Artificial Intelligence and Machine Learning

Artificial Intelligence and Machine Learning influence the healthcare industry to a great extent. With healthcare professionals' busy schedules, now intensified by the global pandemic, complexities, and increased information, the healthcare industry adheres to AI and ML techniques. AI/ML capabilities can be an essential support for clinical decisions and, ultimately, a quicker time-to-treatment. As cloud platforms use AI and ML in their services, cloud computing can support bringing artificial intelligence into general healthcare functions and help professionals manage vast amounts of data.
Better Decision Making

Healthcare enterprises use cloud-based machine learning approaches for accurate and knowledgeable diagnosis in clinical and medical data analytics. Cloud-based solutions deal with large volumes of structured and unstructured data in healthcare repositories, and their predictive models make way for better decision making.
Security is a Significant Concern

One of the most critical challenges the healthcare industry faces is security. How secure is it to have all the applications and patient information in a third-party environment? Healthcare information requires confidentiality. The massive data in the hospital domain may be attacked by hackers or malicious actors, resulting in security and data breaches. The cloud network strengthens security with particular security tools that can inform you about malicious inputs, and cloud servers help protect healthcare reports. With on-premises solutions, if any machine fails, medical institutions may lose all their information and applications; cloud-based security solutions save users from these types of potential risks.
Scalability and Flexibility

Healthcare organizations function in a high-powered environment. Cloud-based technologies used in healthcare areas such as electronic medical records, mobile applications, patient portals, IoT devices, and big data analytics yield better scalability and flexibility. They deliver problem-free applications, which enhances the decision-making process. Healthcare providers must measure data storage capacity and network requirements based on service necessities and demands; cloud technology can scale these storage requirements up or down as healthcare professionals need. As cloud migration can completely replace your conventional approaches to data handling, a well-organized strategy for the migration process is required. A cloud migration strategy not only removes risks but also reduces the chances of downtime, prevents data leaks, enhances data handling, and strengthens security standards.
Summary

Cloud computing has occupied a significant portion of the healthcare sector. The blend of cloud computing with Artificial Intelligence, Machine Learning, Big Data Analytics, and IoT in healthcare produces seamless business outcomes. It improves resource availability and enhances interoperability while reducing costs. With multiple inherent benefits for overcoming healthcare challenges, cloud computing is the best option to opt for. It also enables stakeholders across the entire health IT space (vendors, providers, insurers, and patients) to embrace the cloud's advantages securely, flexibly, and cost-effectively.
Why is ELT Better for Cloud Data Warehousing?
Anjali Prabhanjanan
Introduction:
Modern data warehouses, in cloud and other hybrid environments, provide integrated machine learning solutions for storing data that enable both customer insights and business intelligence (BI), helping businesses make faster decisions. Data warehousing refers to collecting data from several different sources, such as business applications, mobile data, and social media. This data is then used to deliver valuable business insights and analytical reports. The heterogeneous data obtained from various sources is first cleansed and then organized into a consolidated format in the data warehouse. Enterprises and organizations use data warehousing tools and Database Management Systems (DBMS) to access the data stored on the warehouse servers to support their operational business decisions.
The market growth of data warehousing is attributed to factors such as the increasing amount of data generated by enterprises and a growing need for BI to gain a competitive advantage. The huge volumes of data produced by different businesses are exerting tremendous pressure on their existing resources, forcing them to adopt data warehousing solutions for flexible, efficient, and scalable storage. This data can be leveraged using advanced data mining and BI tools to provide valuable business insights to users, strengthening customer retention, increasing operational efficiency, helping in better decision-making, and increasing revenue streams.

Storing data on-premises can become quite expensive if computing power and storage scale differently. Data-driven organizations are the early birds swapping traditional on-premises warehouses for cloud data warehouses, which are more agile and flexible. That's because the latter can instantly scale to deliver high or low computing capacity as required, making them highly cost-effective. Such companies are finding that their ETL (Extract Transform Load) tools are less adaptable to hybrid environments and demand more upgrades, ultimately making them the costlier investment in the long run. The scalability and computing power of ETL are proving to be its Achilles' heel.

Cisco reports that 94% of all workloads will run in some form of cloud environment by 2021. [1] Numerous business enterprises are now considering the benefits of cloud data warehousing, which include on-demand computing, multiple data type support, flexible pricing models, integrated BI tools, and unlimited storage. SMEs (Small and Medium Enterprises) are rapidly adopting the cloud warehousing model due to low infrastructure needs and affordable costs, and the last decade saw rapid growth in cloud adoption rates across many industries. Today, cloud data storage accounts for 45% of all enterprise data, and the number could grow to 53% by Q2 of 2021. [2]
The cloud data warehouses now available, which let companies manage their analytical data by storing and cost-effectively processing it, are changing the landscape. The shift from on-premises servers toward cloud data warehouses is sparking a shift from ETL to ELT (Extract Load Transform). ELT is an alternative technique to traditional ETL: the process pushes the transformation component down to the target database for better performance. This capability is quite useful for processing the massive data sets required for big data analytics and business intelligence (BI). Let us now discuss an efficient process to move and transform data for analysis, crucial for business growth and innovation in this data-driven world.
Loading a data warehouse can be an extremely time-consuming process. ELT is the process of extracting data from one or multiple sources and loading it into a target data warehouse before transforming it, instead of transforming the data before it is written.
The ELT process streamlines the tasks of modern data warehousing and managing big data, so that businesses can focus on data mining for actionable insights. ELT capitalizes on the target system to do the data transformation. As ELT needs only the raw, unprepared data, this approach needs fewer remote sources than other techniques. ELT reduces the time data spends in transit and boosts efficiency, taking advantage of the processing capability built into a data storage infrastructure. Though this process has been in practice for some time, it is now gaining popularity with the more widespread use of Hadoop and cloud-native data lakes.
What is ELT?

It stands for extract, load, and transform: the processes a data pipeline uses to replicate data from a source system into a target system such as a cloud data warehouse. A minimal sketch follows this list.

1. Extraction is the first step, in which data is copied from the source system.
2. Loading is the next step, where the pipeline replicates data from the source into the target system, which could be a data lake or a data warehouse.
3. Transformation happens once the data is in the target system: organizations can run whatever transformations they need. Usually, they will transform the raw data in different ways for use with different tools or business processes.
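Here is that three-step flow in Python, using the built-in sqlite3 module as a stand-in for a cloud warehouse; table and column names are invented for illustration. Note that the transformation runs afterward, in SQL, on the destination's own engine.

import sqlite3

conn = sqlite3.connect(":memory:")

# Extract: raw rows copied from the source system, untouched.
raw = [("ord-1", " Alice ", "42.50"), ("ord-2", "BOB", "7.00")]

# Load: replicate the raw rows straight into the target system.
conn.execute("CREATE TABLE raw_orders (id TEXT, customer TEXT, amount TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", raw)

# Transform: shape the data inside the warehouse, as each consumer needs it.
conn.execute("""
    CREATE VIEW clean_orders AS
    SELECT id,
           TRIM(LOWER(customer)) AS customer,
           CAST(amount AS REAL)  AS amount
    FROM raw_orders
""")
print(conn.execute("SELECT * FROM clean_orders").fetchall())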
ELT and its Infrastructure

ELT is a modern variation on the older ETL process, in which the transformations take place before loading the data. Running the transformations before the load phase results in a more complex data replication process: ETL tools need processing engines to run the transformations before loading the data into a destination. With ELT, businesses instead use the processing engines in the destination to efficiently transform data within the target system. The removal of an intermediate step streamlines the data loading process.
Use case of ELT

As ETL transforms data before the loading stage, it is the ideal process when the destination needs a specific data format: when there's a misalignment in the supported data types between source and destination, when the ability to quickly scale processing in the destination is limited, or when security restrictions make it impossible to store the raw data in a destination.

Liv Up, the Brazilian food tech start-up, provides an example of the effectiveness of ELT in data warehouses. The company integrated data from a variety of sources, such as Google Analytics, MongoDB, and Zendesk, into its data warehouse. Though the process was effective, it was a bit cumbersome. Data from MongoDB was quite challenging, as it needed NoSQL data translated into a relational data structure. The company took about a month to write the code for its data pipeline with traditional ETL. That's when it started looking to reduce the time to value and get its data to its destination more quickly.

So the company turned to Stitch, a cloud-first, developer-focused platform that helps expedite the data replication process. Acquired by Talend in November 2018, Stitch operates as an independent business unit. Hundreds of data teams rely on it to securely and reliably move their data from SaaS (Software-as-a-Service) tools and databases into data lakes and data warehouses. With Stitch, extracting and loading the company's data took about 8 hours a week, and Liv Up benefited from the ability to build its own transformation phase and easily leverage the BI tools that were integral to the company.

ELT is the better approach when the destination is a cloud-native data warehouse like Google BigQuery, Amazon Redshift, Snowflake, or Microsoft Azure SQL Data Warehouse. That's because these organizations can transform their raw data at any time, when and as required for their use case, and not as a step in the data pipeline.
A FEW POPULAR CLOUD DATA WAREHOUSES:

1. Amazon Redshift
Amazon Redshift takes a PaaS (Platform-as-a-Service) approach and is highly scalable. It provisions clusters of nodes to customers as their computing and storage needs evolve; each node has its own CPU, RAM, and storage space. To set up Redshift, you must provision clusters through AWS (Amazon Web Services). Redshift lets its users automatically add clusters in times of high demand.
2. Google BigQuery
The best thing about the architecture of BigQuery is that you don't need to know anything about it. It is serverless, so its underlying architecture is hidden from users (in a good way). It can scale to thousands of machines by structuring computations as an execution tree: BigQuery sends queries through a root server, intermediate servers, and finally leaf servers with local storage.
3. Microsoft Azure SQL Data Warehouse
Azure SQL Data Warehouse is an elastic, large-scale DWaaS (Data Warehouse-as-a-Service) that leverages the broad ecosystem of SQL Server. It uses a distributed MPP (massively parallel processing) architecture designed to handle multiple operations simultaneously through several processing units that work independently, each with its own dedicated memory and operating system. It collects data from databases and SaaS platforms into one powerful, fully-managed centralized repository. Compute and storage are billed separately, so they can scale independently.

4. Snowflake
Snowflake is a DWaaS that operates across multiple clouds, including AWS, Microsoft Azure, and soon, Google Cloud. It separates storage, compute, and services into detached layers, allowing them to scale independently. The automatically managed storage layers can hold structured or semi-structured data. The compute layer contains clusters, each of which can access all data but works independently and concurrently, enabling automatic scaling, distribution, and rebalancing.
ELT ADVANTAGES FOR BUSINESSES

The explosion in the types and volume of data to be processed by businesses can put a strain on traditional data warehouses. Using an ETL process to manage millions of records in new formats can be quite expensive and time-consuming. This is where ELT offers numerous advantages over ETL:
01 Faster Time to Value
Generally, ELT provides a faster time to value, which means business intelligence is available far more quickly. ETL, by contrast, needs a time-intensive and resource-heavy transformation step before loading or integrating data.

02 Scalability
ELT tools are used along with cloud data warehouses, which are designed to autoscale under increased processing loads. Cloud platforms allow almost unlimited scale within seconds or minutes, while older generations of on-premises data warehouses require organizations to order, install, and configure new hardware.

03 Flexibility
You can replicate raw data into your data lake or data warehouse and transform it when and however you need with the ELT process, as it is adaptable and flexible. This makes it suitable for a wide variety of businesses, goals, and applications.

04 Simplifies Management
It separates the loading and transformation tasks, lowers the risk, minimizes the interdependencies between the processes, and streamlines project management.

05 Leverages the Latest Technologies
It harnesses the power of new technologies to push improvements, compliance, and security across the enterprise. It also leverages the native capabilities of modern big data processing frameworks and cloud data warehouses.

06 Future-proofs Data Sets
ELT implementations can be used directly for data warehousing systems. However, most of the time ELT is used in the data lake approach, where data is collected from a range of sources, and the separation of the transformation process makes it easier to make future changes to the warehouse structure.

07 Lowers the Cost
Cloud-based ELT can result in a lower total cost of ownership, as an upfront investment in hardware is often unnecessary.
Though ELT is still evolving, it offers the promise of unlimited access to data, less development time, and significant cost savings; thus, it redefines data integration.
HOW ELT WORKS

It is becoming increasingly common to extract data from its source locations, load it into a target data warehouse, and transform it there into actionable business intelligence. This process is called ELT and involves the following steps:
1. Extract
This step is the same in both the ETL and ELT data management approaches. The raw streams of data from virtual infrastructure, software, and applications are consumed either in their entirety or as per predefined rules.
2. Load
This is where ELT differs from its earlier cousin, ETL. Instead of delivering the raw data to a temporary processing server for transformation, ELT delivers the data directly to the destination storage. This shortens the cycle between extraction and delivery of the data.
3. Transform
The data warehouse sorts and normalizes the data and keeps part or all of it accessible for customized reporting. The overhead of storing a huge amount of data is higher, but it offers more opportunities to mine for relevant BI in near real-time.
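The payoff of that ordering is transform-on-demand: the warehouse keeps every raw event, and each report derives its own shape from them when it runs. A small Python sketch, with invented event data:

raw_events = [
    {"region": "EU", "amount": 40.0},
    {"region": "US", "amount": 10.0},
    {"region": "EU", "amount": 2.5},
]

# Consumer 1: revenue rolled up by region.
def revenue_by_region(events):
    totals = {}
    for e in events:
        totals[e["region"]] = totals.get(e["region"], 0.0) + e["amount"]
    return totals

# Consumer 2: individual high-value orders, a different shape entirely.
def high_value_orders(events, threshold=25.0):
    return [e for e in events if e["amount"] > threshold]

print(revenue_by_region(raw_events))   # {'EU': 42.5, 'US': 10.0}
print(high_value_orders(raw_events))   # [{'region': 'EU', 'amount': 40.0}]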
WHY IS ELT BETTER? Let us take a closer look at the difference between ETL and ELT processes. The primary difference between ETL and ELT is the amount of data retained in the warehouses and where the data is transformed. In ETL, the transformation of data is done before loading into a warehouse, which enables the analysts and business users to get the data they need faster. Moreover, they don’t have to build complex transformations or persistent tables in their business intelligence tools. Whereas, in ELT, the data is loaded into the warehouse or data lake as it is, without transformation before loading. This makes it an easier job for configuration as it only needs a source and a destination. The ETL and ELT approaches for data integration differs in the following ways: 1. Load Time It takes significantly longer to get data from the source system to the target system with ETL, whereas it is faster with ELT.
5. Maintenance ETL requires significant maintenance for updating data in the warehouse, whereas, with ELT, data is always available in near real-time.
Conclusion Both the ELT and ETL processes have a place in today’s competitive scenario. Understanding the unique needs and strategies of a business is the key to determine which process will deliver the best outcomes. Businesses need a data warehouse to analyze data over time and deliver actionable BI. So, should you deploy your data warehouse on-premises at your own data center or in the cloud? The answer depends on factors like cost, scalability, control, resources, and security. Some businesses may deploy a data warehouse on-premises, in the cloud, or a hybrid solution that combines both. An organization choosing an on-premises data warehouse must purchase, deploy, and maintain all the hardware and software.
2. Transformation Time ELT performs data transformation on-demand, using the computing power of the target system, which significantly However, as a cloud data warehouse is reduces the wait times for transformation a SaaS, having no physical hardware, a as with ETL. business will pay for the storage space and the computing power they need 3. Data Warehouse Support at a given time. A business pays for the ETL is a better approach for legacy storage space and the computing power on-premise data warehouses and they need at a given time. Scalability is structured data, while ELT is designed for simply adding more cloud resources, the scalability of the cloud. and there’s no need to employ people or maintain the system as those tasks 4. Complexity are handled by the provider. That’s Typically, the ETL tools have an easywhy cloud-based data warehouses to-use GUI (Graphic User Interface) that and ELT go hand-in-hand with regards simplifies the process. While ELT needs to performance, scalability, and lower in-depth knowledge of BI tools, masses costs as compared to the on-premise of raw data, and a database that can databases and ETL. transform the data effectively. 45
The Future of Wealth Management: Tech or Hybrid?
Barkha Sheth
With futurists looking to re-invent the asset management value chain and consumers demanding more client-centricity, can tomorrow's wealth management solution be led solely by tech? Digitalisation is promising a massive upgrade to long-term financial planning. What can this mean for you as an investor?

An investor usually has two broad objectives: to grow their capital and to protect what they already have. But such seemingly simple goals can face some tiresomely complex hurdles. From trying to understand asset classes and keeping track of portfolio performance to changing investment buckets based on market trends, an average investor often has a lot to contend with. Combine this with increased uncertainty and the rising cost of risk, and you have a consumer who is frayed at the edges and demanding a lot more in the way of client-centricity.

As per a report published by Deloitte in 2017, we are about to witness the largest wealth transfer in history over the next 40 years, with almost USD 50 trillion changing hands from one generation to another. This makes it even more essential for institutions to gear up to address the demands of a re-wired millennial generation. With increased globalisation, it's vital to recognise that digital transformation is no longer a choice but an imperative for sustainability. This is something wealth managers have now realised.
Risking disruption to reinvent its value chain, wealth management is now following in the footsteps of consumer-centric organisations like Amazon, Apple and Netflix, to name a few, to empower the digitally-savvy investor with tools and methodologies that, up until now, were the purview of a select few.
The Emergence of Wealthtech
Just as fintech transformed financial services, Wealthtech promises a revolution in the investment management industry. It encompasses a range of different channels, from B2C robo-advisors and micro-investing platforms that allow users to regularly save small sums of money, to B2B providers that offer digital brokerages and wealth advisory. Wealthtech is attempting to create a fully operational wealth management bank 'in your pocket'. With cutting-edge technology, firms like Wealthfront, Addepar and InvestCloud are helping investors benefit from Artificial Intelligence (AI) and deep data-driven analytics to receive real-time personal financial management at a fraction of the cost.
No Longer A One-Size-Fits-All
Thanks to Machine Learning, advisory firms are infusing data and analytics to serve clients holistically, generating relevant, timely, and actionable insights for clients and advisors alike. Embedded at the heart of Wealthtech is behaviour-based segmentation built on risk appetite, interests, and financial goals. This brings a more intuitive experience to the investor, going beyond just recommending portfolio allocation based on risk profile or assets under management (AUM). Instead, more customised offerings are made available based on client personas and interests. With this outside-in approach, technology is bridging the chasm between business objectives and customer expectations.
Taking Robo-Advisory To The Next Level
But as consumer sentiment evolves, the new generation of wealth customer is looking beyond just growing their capital. With urbanised goals like sustaining a lifestyle, education planning for kids, healthcare costs of ageing parents, or even buying a new home, today's consumer needs more appropriate real-time advice than ever before. This is perhaps where traditional wealth management has sometimes fallen short of its objectives. With analysts' stock picks underperforming market indexes more often than not and underdog plays making headline-worthy moves, advisors often find themselves dragging their feet when it comes to generating timely advice and meeting client goals.

For Wealthtech innovators, this is a core focus area: bringing granular, needs-based advisory, in real-time, to the mass market. By integrating economic trends and even social media analytics, whose singular stands in recent times have moved markets beyond measure, this type of advice becomes more relevant than ever. Combining this with click-of-the-button UI, Wealthtech brings in user-friendly omnichannel support to simplify investing and, at the same time, give control back to the investor.

For instance, an investor can individualise their financial planning through goal-based bucket creation, setting aside a long-term retirement corpus while keeping an aggressive short-term bucket for saving up for a down payment on a house. If market sentiment changes in the short term, the investor can shift money from one bucket to another. This is perhaps the most significant takeaway from this value chain: more control, better individualisation, and better expectation management. Taking it a step further, a personalised dashboard gives you 24/7 access to your portfolio, with the ability to compare performance across different asset classes ranging from commodities, mutual funds, energy, and real estate to even art.

But where does this leave the traditional human element of financial planning advisory?
The Convergence Of Tech And The Human Element
Wealth management has been a long-standing stalwart of the personal advisory model, but algorithms are slowly taking over some of the more fundamental functions of human advisors. However, that doesn't entirely render the human element obsolete. Rather, it augments the offerings while still retaining the traditional model's personal and emotional component through a more hybrid approach. Human advisory's significance is invaluable for clients with material assets to invest, or when it comes to tax planning and estate management. This is why the digitalisation of the wealth management space is so unique: the pairing of the human relationship with automation to create an omnichannel service makes financial planning a lot more approachable for the masses.
Wealth Management 2.0
The paradigm shift in the wealth management landscape is occurring rapidly, and these digitalisation trends bear testament to the massive change that the wealth management space is facing. With the gamification of UI, the enormous shift in the demographics now controlling wealth, and the new, digitally rewired customer at the helm, Wealthtech seeks to bring far-reaching changes to a system that up until now was the purview of a select few. We, for one, are excited at the potential of this change.
Welcome to Wealth Management 2.0.
Arnab Pandey
Arnab Pandey is the Senior Analyst of Fixed Income Derivatives and FX at Raymond James Financial. He also serves as a board member of the D&I council for Fixed Income at Raymond James. He did his bachelor's in engineering, specializing in Applied Mechanics and Biomedical Sciences. He discovered his passion for numbers when he got the opportunity to study statistics online through a course offered by Harvard University. He later did his MBA with a concentration in Finance from UT, along with an online executive certification in Finance (valuation) from Harvard Business School.
He is a passionate data analyst, and his love of statistics and technology has led him to research and learn about unique investment strategies and quantitative analysis. He was awarded the 'Best in Business' honor during his MBA studies by the Beta Gamma Sigma Honor Society. He has led several global teams to solve some of the biggest economic challenges around the world. One of his research projects, on India's socio-economic development, is supported by 35 nations globally and has appeared in national newspapers in India. He has significant exposure in the fields of private equity, buy-side investment banking, capital markets, and compliance. Apart from his work in the capital markets, he is a nationally awarded debater (Parliamentary Award by the Government of India), an internationally acclaimed researcher, and a passionate blogger. He is also a columnist for various business magazines. He was nominated for the Padma Shri award, one of the highest civilian honors of India, for 2021.
We discuss capital markets with him and how technology is changing the landscape.

Hello, Arnab. Thank you for joining us today. We are excited to learn more about how technology is impacting the capital markets.

Thank you for having me.

Let's start with the first question. Technology is now more closely entwined in the business of capital markets than ever before, and the COVID-19 pandemic accelerated digital and cloud transformation in the finance sector. What are the major challenges you believe this sector is facing due to the sudden need for digital transformation and cloud platform-based delivery models?
I believe technology has always played a crucial role in capital markets since their inception. But the key factor to understand here is the rate of technological acceptance in the financial sector as technology evolved through the years. If you analyze it, you will see that it has been rather slow. COVID-19 has literally forced institutions to re-think digitalization strategies without providing a lot of time to think about them, and we often see institutions struggling due to that. While a few institutions successfully implemented digital changes, many financial institutions have faced problems due to the inheritance of complex infrastructure from M&A activities. That literally raises the debate of systems integration, legacy versus new, increasing the cost of implementation at higher levels.

A study by BDO states that about 70% of financial services executives feel that a lack of proper training and the required skill sets is the biggest challenge to a new digital initiative. Due to the size of the banking and financial services business, change takes time. It is not just the fundamental technology that lags behind but also the organization's digital culture. A lot of the time, we see that departments in financial institutions work in silos. Each department has its own way of using a platform or system; they have their own procedure that might not be optimal at all, but they still want to use it. There is no centralized mechanism through which you can educate people to use systems, and when employees are too comfortable using legacy systems, training them with motivation becomes extremely difficult. A recent study by the Boston Consulting
Group explains that more than 80% of the digital transformation initiatives fail due to a lack of digital culture.
In 2021, as financial entities decide on a short-to-medium term business strategy, technology investment decisions will hinge on multiple considerations. What will provide the greatest impact at both inter- and intra-firm levels? What will best address each entity's technical state and organizational culture?
In my opinion, even before talking about a short or medium-term business strategy revolving around technology investments in the financial sector, firms have to understand that they are going in for the long-term gain. As I mentioned earlier, most of these technology projects' ROI might not be noticeable in the short term. One has to project the future cash flows over a significant period of time to understand the net present value of these projects; most of the time, these are positive values. People have to understand that digitization is not and will never be equivalent to digitalization. Ever.

Then comes the most crucial part for any organization to succeed in digital transformation, and that is establishing a digital culture. To achieve this, the first step could be to simplify the legacy systems and start opting for artificial intelligence, process automation, robotics technology, SaaS infrastructure, and data mining models that can read through unstructured pieces of data. I know that the industry right now is pretty far from all this, but change is coming, and we all have to embrace it; otherwise, we might become extinct.

Digital culture in an organization does not really mean just having digital products, services, and interactions via digital platforms; it also means empowering employees at all levels (especially operations) with technology. Empowering employees with the right technology, knowledge, and support will lead to a change in technological acceptance, boost morale, and enhance communication and interaction both internally (with employees) and externally (with customers). This will lead to faster results and better decision-making, because digital organizations tend to be less hierarchical. Employees will prioritize making the customers' experience better, as digital organizations use a closed-loop feedback mechanism. Empowered operations staff will always put themselves in the customer's perspective for effective feedback because, indirectly, that will make their lives easier and improve their performance.

The mindset shift from top- to bottom-level employees in an organization to make digitalization a part of their everyday core functions is the need of the hour. From the board members to the risk management teams, everyone should be cognizant of what technology they are using and what the risks and rewards are for each of the systems and modules. Even for the people in the front office and the business in general, the over-reliance on a software vendor or IT department for each of their technical problems is a thing of the past. A comprehensive technology school should be established as a part of the digitalization program, where the right talent could teach and lead technology innovation across the organization through knowledge transfer and hands-on training.
This year has accelerated innovation to reduce risk and improve resilience across all sectors, but how did technology influence innovation in capital markets?
Capital markets, in general, have seen unprecedented growth over the last five years in incorporating different kinds of technology and digitalization techniques. The market is also looking forward to some fascinating upcoming amalgamations of mathematics and tech-based products and algorithms at its disposal. I find it really interesting to talk about technology and capital markets, and can talk for hours, but for the sake of the reader, I will keep it short and crisp.

One of the most disruptive and important innovations that the markets have experienced is cloud computing. The innovation in this space is amazing, and the choices are vast, because the FinTech industry is literally competing with itself in this space for market share. Competition in the market, and cloud providers embracing open-source solutions together with their own proprietary software, is good for consumers: the services and products offered become less and less expensive. One of the interesting advancements in the cloud computing arena in capital markets has been the integration of time-series databases with parallel streaming of activities in milliseconds to viewers linked to exchanges such as Nasdaq, NYSE, etc. This has literally changed how companies perform and view activities in the markets. It is one click away from your phone.

Technologies such as the distributed ledger mechanism (blockchain) have great potential towards making processes transparent, secure, and more error-free than ever before. Someone's output is someone else's input, and if there is a mismatch, you cannot proceed further without fixing the error or without agreeing on a consensus. No central validation is required, and cryptography enhances the authenticity of each block. What does that mean if you introduce this mechanism into financial operations? Reduced costs of errors, sure-shot outputs, and increased empowerment, trust, transparency, and structure in the processes.

The middle office and back office heavily rely on Robotic Process Automation (RPA) and Artificial Intelligence (AI) nowadays. RPA has the ability to eliminate human error in structured, repetitive jobs, enhance speed, reduce operations cost, and increase profitability by quantifying the amount of work done in a period of time. Now the question is, what if you need decision-making – can RPA alone do the job? No. But if you introduce artificial intelligence to the RPA, you might be able to train your algorithm to parse through unstructured data as well. The algorithm can be trained in such a manner that the robot learns and adapts to the process autonomously – and then see the results. You will be amazed!

Did someone say Smart Contracts? Heaps of paper-based legal documentation can very well be replaced with Smart Contracts.
Smart contracts are basically programs that verify, facilitate, or negotiate the performance of a contract without themselves being legal contracts. Distributed ledgers might be enhanced with smart contracts, and there is research going on as we speak into whether that system can mimic service agreements and credit support annexes in the derivatives world. If that happens, it will provide extreme transparency over contractual terms between dealers and counterparties in the markets. But yes, Smart Contracts can only work on black-and-white structured data, so they might fail when decision-making over a contract is needed. So, there are some risks associated with them as well.

I was reading one of the studies conducted by EY the other day, and it stated that the average amount of customer data that goes unused by businesses is a staggering 80%. Think what level of significance a predictive analytics module could project your outputs to if we could access this unused data. Very, very high! It's like being able to predict the future performance of a portfolio with an outstanding level of significance. Here comes Advanced Analytics – a process with which you can make precise and powerful decisions by using extensive data sets. It encompasses a number of techniques, but the most well-known are predictive analysis, sentiment analysis, behavioral analysis, and data visualization. The only problem with this mechanism is that it requires a huge amount of data to provide better outputs, and the cost of parsing useful data out of structured and unstructured data is usually high. So, here is a place where we need innovation!

Capital market firms are moving to automate bond pricing and risk management by leveraging AI predictive analytics for better pricing and liquidity in the market. Do you believe that AI-based algorithms can enhance bond trading performance for a firm?
Yes, I cannot agree more that AI applications can significantly enhance bond trading performance in firms. AI can be extremely helpful because the bond market is considered less liquid than the stock market. There are various reasons for this, such as trading over the phone, over-the-counter (OTC) trading, and the huge trade sizes a bond trader can deal in – and, of course, the fact that there are no direct announcements of the current market price of bonds, to name a few. You can say that there is a liquidity issue with bonds. The transparency of information between a stock and a bond is a matter of concern: you can see the market price of a stock almost in real-time, while there is a 15-minute delay in actually accessing the market price of a bond.

This is where AI can play a vital role in bond price prediction. With big data analytics and machine learning algorithms, price prediction on bond desks can be significantly improved. Algorithms can utilize the large historical data sets available and work on various variables such as trading pattern, trading history, issuer fundamentals, ratings, and other market economics. Although the algorithms can be designed to find trends in the data and transform unstructured raw data into usable structured data, it is imperative to employ machine learning techniques to improve data mining processes.
Collaborative filtering systems and hybrid recommendation systems can be used to analyze a trader's trading history and recommend which bonds to buy and which to sell. Some hedge funds and financial institutions use chatbots to provide assistance to customers; the same technique can be built on smart online platforms such as IBM Watson to create chatbots that advise traders on which bond is a good buy and investment-worthy, thereby reducing the risk in the portfolio, providing greater risk-adjusted returns, and reducing human error. Machine learning algorithms can also be used for excellent interpolation and extrapolation, filling in missing yields when the market is illiquid. Think about an algorithm that trains itself by observing hundreds of thousands of historical corporate bond yield curves. It can internalize what a yield curve should look like, so when there are missing points due to illiquidity in the market, it can interpolate or extrapolate the data using historical trends and predict the yield curve, thereby predicting the price of the bonds. There is a research paper that focuses explicitly on this, titled 'Machine Learning for Yield Curve Feature Extraction: Application to Illiquid Corporate Bonds', written by Greg Kirczenow et al.
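As an editorial illustration of the interpolation idea Arnab describes, here is a toy sketch. It uses a plain cubic spline rather than the learned-feature approach of the paper he cites, and the maturities and yields are invented numbers:

```python
# Illustrative sketch only: filling a missing yield-curve point by interpolation.
# A cubic spline is a much simpler stand-in for the ML approach described above.
import numpy as np
from scipy.interpolate import CubicSpline

# Observed (maturity in years, yield in %) for a hypothetical illiquid issuer;
# the 7-year point is missing because no recent trades exist there.
maturities = np.array([1.0, 2.0, 3.0, 5.0, 10.0])
yields_pct = np.array([0.8, 1.1, 1.4, 1.9, 2.6])

curve = CubicSpline(maturities, yields_pct)

missing_maturity = 7.0
print(f"Interpolated 7y yield: {curve(missing_maturity):.2f}%")
```

A learned model would replace the spline with features extracted from thousands of historical curves, which is what lets it extrapolate sensibly rather than just smooth between observed points.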
Can you talk about quantamental investing, and what are the major challenges associated with it?
Quantamental Investing has pretty much become a buzzword nowadays, especially among hedge funds, wealth managers, and trading firms. Although it sounds very complicated, which it truly is, we can understand it by simply breaking the term Quantamental into two words – "Quantitative" and "Fundamental". So, essentially, Quantamental Investing is the use of quantitative methods to derive insights from extremely large amounts of market data, followed by the fundamental approach of investing to take the final investment decision.

Quantitative investing refers to applying complex machine learning algorithms over a vast amount of structured and unstructured data available in the market to derive insights, patterns, trends, etc. As computers are involved, algorithms can be faster than humans in analyzing trends and better at predicting market movements or security pricing in the future. Quantitative strategies such as price arbitrage are used to scan the market, find securities, compare them with similar securities, and determine if there is any price difference. If there is a price difference, you can capitalize on that differentiator, which is known as arbitrage. Factor investing is another quantitative strategy, where the models try to find specific economic factors that could influence the pricing of a security (macroeconomic factors, microeconomic factors, style factors such as market cap, etc.). Again, if there is a difference in price among identical securities, you can capitalize on that. Algorithms are faster, more error-free, and not biased like a human brain, so decisions and options can be generated within seconds, acted upon, and get you better risk-adjusted returns.

Fundamental investing refers to the traditional bottom-to-top method of stock picking: analyzing the balance sheet, income statement, cash flows, ratings, etc., of a company and then deciding on the health of the company before investing. Although this is traditionally one of the best processes for picking stocks and bonds, it can be slow, and it is nearly impossible to find and capitalize on opportunities that exist all over the market when only human brains are involved.
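As a concrete illustration of the price-difference scan described above, here is a toy sketch; the tickers, prices, and peer grouping are invented, and a real desk would of course work from live market data:

```python
# Toy sketch of a quantitative scan: flag pairs of similar securities whose
# prices have diverged beyond a threshold. All names and numbers are invented.
from itertools import combinations

# Hypothetical peer group of similar securities and their latest prices.
prices = {"BANK_A": 102.4, "BANK_B": 99.1, "BANK_C": 108.9}

THRESHOLD = 0.05  # flag divergences greater than 5%

for a, b in combinations(prices, 2):
    divergence = abs(prices[a] - prices[b]) / min(prices[a], prices[b])
    if divergence > THRESHOLD:
        print(f"{a} vs {b}: {divergence:.1%} apart - candidate for review")
```

In the quantamental workflow, a flagged pair like this would then go to a human analyst, who applies the fundamental lens before any trade is made.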
By merging the two processes and binding the computer's computing and analytical power with the human's rationale and decision-making ability, a really good ensemble can be created to get the best out of the market. That is your Quantamental Investing. There are some big Wall Street players and hedge funds that have already implemented Quantamental Investing in their approach and have generated competitive returns.

But with all the good comes the bad too; it is not that Quantamental Investing does not have its problems. First of all, you need extremely large data sets to actually predict the markets. A lot of the time, huge data sets can yield no results because they are corrupted or not appropriately mined. That calls for the right data mining processes, which machine learning might solve with time. Then comes the problem of cognitive ability. An algorithm can analyze data and get you the trends, but a human needs to take the final decision, and a human can make a wrong decision even when given the best alternatives. Machine learning algorithms with sound cognitive abilities that mimic the human brain are still far in the future. We also have to remember that humans build these algorithms and strategies, so the results derived by these algorithms are tied to the developer's intelligence and capabilities. Highly skilled analysts need to be hired to code these algorithms, and significant upgrades to internal systems are required to cope with the additional strain on resources caused by big data analytics. Thus, this requires substantial investment even in the beginning stages of development.

Thank you for taking time out of your busy schedule to join us for the interview. We wish you good luck in all your future endeavors.
Apurva Minchekar
Technology and Stock Market: A Millennial's Take
Technology has impacted the lives of billions. With technology evolving every single day across industries, it is hard to imagine a future without being surrounded by it. The penetration of technology has also influenced the stock market industry, making it more efficient and helping consumers take leaps and bounds. According to experts, millennials have piled into the trading industry and can fuel the market to disrupt stocks. Drawing on a conversation with Aditya Meera Sharma, a millennial in the trading industry, here is an analysis of how technology has reshaped the way we trade.
Digital Trading
Technology has drastically changed the way investors trade and has elevated digital trading. Digital trading has allowed small risk-takers to invest because of high ROI, discount brokers, and minimal risks. Anybody can start trading with INR 10,000 and can expect minimal losses relative to their brokerage. Brokers can now provide 4X leverage, allowing small investors to buy four times their original capital. It is a significant benefit for traders with a dynamic risk profile who want to grow their money quickly.
Real-Time Stock Performance Analysis
Technology has made selling and buying shares easier than before. With the help of technology, investors are now aware of the latest stock updates and can track market performance in real-time, making trading and analysis easier and stress-free. Mr. Sharma recalls how he used to help his mom keep an eye on stock charts on CNBC and watched her call brokers to execute buy and exit positions, which always resulted in delays. He added, "Today it's all in front of the screen. Brokers have their trading platform, and with a single click we can know the current market price of the stock, we can execute the orders, and it would take less than 2-3 seconds to get into a trade or exit out." Investors can also study the market trend before investing their hard-earned money in any company. Transaction errors have also been reduced because of these advanced features.
Sporadic Market Shift
"Earn more" is the easiest trap to fall for. With everything becoming digital across industries, digital platforms are one of the easiest ways to disseminate false information and mislead investors. Needless to say, a message is forwarded within seconds, often before the individual reads it entirely or verifies it. Mr. Sharma suggests that choosing the right mentor is one way to minimize such frauds: always ask a mentor to show their past year's profit and loss statement before taking things forward. Trading cannot be mastered in just 2-3 sessions, so choose a mentor who will guide you for months through the ups and downs of the game.
Development of Mobile Applications
Nowadays, many trading applications are available on the Play Store, keeping stock market details at investors' fingertips. The ability to trade from anywhere, anytime, with fewer restrictions, has increased the number of investors. Unlike the hefty trading fees that used to eat into investment gains, trading applications have reduced fees, allowing investors to keep more of their wealth and attracting new investors to the stock market industry. Many applications also teach the basics of trading and provide demo accounts to practice on.
Artificial Intelligence
Controlling your greed and fear in trading is not everyone's cup of tea, and hence many people choose to simply code their strategies and let AI trade on their behalf. The future of trading points toward greater AI involvement, as it allows an individual to focus on other chores; however, this style of trading may not be accessible to everyone, and individual traders will face difficulties competing with firms whose computers are far faster than theirs.
Increased Investors
During COVID-19, people lost their jobs within seconds, reflecting that even having a full-time job is risky: you could be laid off anytime, for any reason. Many people choose to trade full-time or part-time because of the money factor; they can earn around, or more than, their regular job income. The Internet is a library of all the information you could want to learn from, and the combination of the Internet and technology in the stock market industry has attracted more investors. According to reports, individual investors made up 19.5 percent of the market in 2020, double the share in 2010. Because the Internet is accessible to everyone, anyone can learn the basics of trading, and technology has proven to be a boon to the stock market industry. Under the old, traditional way of trading, far fewer people would have made the effort to consider trading at all.
Technology has made the lives of human beings easier than anyone could have imagined. Everything can be performed at your fingertips, error-free. With more advancements, it will effectively and efficiently shape the future of the stock industry. However, investment gains and losses are not the responsibility of technology; it is only a means of providing you with market predictions. The rest is a test of your trading skills.
The Computer in Your Hands – How These Pioneers Reshaped Our World
Barkha Sheth
Once upon a time… Isn’t that how all stories begin? Maybe this should too. You see, just like any other story, this tale also has its protagonists (many in fact), it has dreams and ambitions, success, and failures, and just like any story, it has an antagonist, albeit an unlikely one. This story is incredibly significant, for if it had not unfolded the way it did, the comforts we are so inherently familiar with wouldn’t have existed. What could possibly shape our present this way? ‘Technology’ of course. Our daily existence is entangled with one piece of tech or the other. Would this have been possible if our protagonists, a group of innovators, loners, geeks, and hackers, had not had their way and gotten a computer into our homes? Perhaps not. But is this story about computers and how they became part of our lives, say from the ’80s or ’90s? ‘Yes’ and ‘No’. Yes, the personal computer gained popularity in those decades, but ‘No’, the computers’ advent is undoubtedly not so recent. We certainly cannot discount the efforts of those who came much before and set the standards in computing we follow even today. The story we are about to tell you achieved fruition in the 70s but found its feet a long, long time back, well into the 1800s, to be precise.
Chapter 1 – Charles Babbage to Turing
In the 1830s, Charles Babbage, considered by some to be the 'Father of the Computer', developed a prototype for a steam-powered mathematical machine called the 'Analytical Engine'. Along with Ada Lovelace, he spent the next 20 years or so getting the device to perform complex mathematical calculations. While their project never quite took off, it certainly set the standard for modern computer ideology. It took another 30-odd years for their idea to find its actual application, when the first computer was born out of a severe number-crunching crisis. In the 1880s, unable to tabulate the U.S. Census, the government sought a faster way to complete the process, leading to Herman Hollerith's design of the first working punch card system. It was Herman Hollerith who established the company that would later become 'IBM'.
One world war and the Great Depression later, the 1930s saw the legendary Alan Turing present the notion of a 'Universal Machine', capable of computing anything computable. If reading history is not your thing, then perhaps an evening watching The Imitation Game and Benedict Cumberbatch might give you an immense appreciation for the role Turing has played. Here is an interesting fact: Turing, an active proponent of AI, invented the 'Imitation Game' to check a machine's ability to exhibit intelligent behavior. He also addressed 'Lady Lovelace's Objection': her passionate belief that computers cannot be made to think. The test is still a gold standard in determining AI, and his theory of the Universal Machine is the basis of modern-day 'Computer Science'.

These events are critical in history because if these three, amongst many, had not stepped in with their revolutionary ideas, then the modern-day foundation of computers would never have been set. And their biggest hindrance? Technology itself, or rather the lack of it. Yes, there were glimmers of a technological future, but with electricity itself coming into American homes on a larger scale only around the 1940s, things were still relatively primitive. But the 1950s was about to help things in a big way.
Chapter 2 – Setting the Stage
By the 1950s, we were firmly in the midst of the industrial revolution, and innovation was the name of the game. The funding of WWII had provided the ideal springboard for technology. Prodigious inventions and modernization swept the world and gave way to the first commercial computer in 1952, along with a host of other discoveries. From Hewlett-Packard's ground-breaking electronic innovations to computer languages, and from mainframes to the integrated circuit known popularly as the computer chip, the era of electronic computing had arrived, and so had Silicon Valley.
The 1960s welcomed the prototype of the modern computer, with a mouse and a graphical user interface. We then saw significant headway, with Intel unveiling the DRAM chip, IBM inventing the 'floppy disk', and Xerox developing Ethernet for connectivity. Commercial computers were on a roll, but now the stage had been set for something bigger.
Chapter 3 – The Birth of The Personal Computer
The second half of the '70s dramatically reshaped the 'computer'. A slew of microcomputers hit the market, and while calling them small from today's point of view might be a bit of an exaggeration, the reality was they were not taking up entire walls of a room.
The Altair 8800 took the credit for being the world's first microcomputer. Part of a hobby kit requiring assembly, it was designed by a little-known company called 'MITS'. By then, the interest in owning a computer was already relatively high, and the Altair's entry price made it the most attractive new-age tech to own. The Altair did not come with a keyboard or even a monitor, for that matter. While it may sound strange, this computer with 256 bytes of RAM (yes, you read that right) first shipped with only switches and lights as the interface, and all one could do with the machine was write programs to make the lights blink. And yet, it was a huge success, so much so that a short while later, MITS came up with the Altair 8800b, this time with significant upgrades. Seeing the opportunity to make the computer more attractive to hobbyists, Paul Allen and Bill Gates offered to write an interpreter for this computer in 'BASIC'. After their success with this endeavor, they formed Microsoft on April 4th, 1975.
Chapter 4 – Apple Takes the Biggest Slice of The Pie
On April Fools' Day of 1976, the world's most valuable brand as of 2021 was established. Ten days later, Apple released the Apple Computer, more popularly known as the Apple-1. While famous, it was the Apple II's launch in 1977 that got the ball rolling in the field of tech. The Apple II was the company's first consumer product, designed to be used
right out of the box, and it rapidly became America's first home computer. In 1981, IBM introduced its own computer, code-named "Acorn". One of the first computers to use integrated third-party devices and software, it was the 'Acorn' that popularised the term 'PC'.
Chapter 5 – The Machine of The Year is….
By 1984, the Apple II was a humongous success. Having sold almost 2 million units, and with more and more businesses using computers, the personal computer had become a must-have accessory. Apple eventually launched the Lisa in 1983, a significant idea that 'failed'. Its learnings translated into the innovation of the 'Macintosh' in 1984, which became the PC industry's longest-running, most influential line of computers. Conceptually, every modern PC – including those that run Windows and Linux – descends from it. By 1984, we had two tech camps, the Mac and the PC, both vying for top accolades. In an interview with Newsweek in 1984, Steve Jobs made a fascinating observation months after the Mac was introduced. "The next stage (of computers) is going to be as 'agents,'" said Jobs. "… Rather than help you, it will begin to guide you through large amounts of information. It will almost be like you have a little friend inside that box," he continued. Prophetic words indeed.
Chapter 6 – The Agent That Upscaled Our Lives
The computer's popularity picked up primarily because of interest in the ease of computing numbers. VisiCalc, introduced in 1979, was the software that accountants dreamed of. From there, the race was on: word processing, EasyWriter, Lotus 1-2-3 – all came with better, more evolutionary features, and the software wars had started.
But the messiah that emerged as the stalwart of all things software was Microsoft Windows. While the Mac came with the classic Mac OS, other personal computers made by IBM, Compaq, and HP all had to contend with the lack of a user-friendly graphical user interface. Windows filled a gaping hole in that space, steadily becoming the OS of choice in the '90s.
Conclusion
For most millennials, the period from the second half of the 1980s is still relatively fresh in memory. We saw advances in every aspect of computing. Computers went from walls to desks to your hands. From the dot-com era to the widespread acceptance of the internet, and from the iPod to advances in AI, technology grew many-fold. The tenacity and persistence of Turing, Jobs, Gates, and many more paid off and resulted in our better state of existence. Now we stand at another exciting juncture in history, with AI, IoT, and Big Data promising better productivity, efficacy, and connectivity.
So, this cannot really be a conclusion, for the story has only just begun. Its new chapters are being written by those who continue to tread our tech protagonists' paths, even as we speak. As Turing rightly said:
“We can only see a short distance ahead, but we can see plenty there that needs to be done”.
Ambika Bhandari
Is Artificial Intelligence a Threat to Journalists and Writers?
Remember the movie WALL-E, where every little thing was controlled by machines, from household work to running the massive spaceship? Well, guess what? With the development of Artificial Intelligence (AI), we are relatively close to that!
But do we have to fear AI? 'Robo Journalism' is a buzzword in the news industry! Ever read an article written by a bot? You may have, without even realizing it. That's because bots are designed to imitate the writing style, language, and structure of a human. For now, Robo Journalists write short articles, but they are expected to assist with further journalistic tasks in the near future. Yes, 'assist'! You need not fear that AI will take away your jobs.
Why Do People Fear AI?
Not only the news industry but almost all industries are using bots. Robots assist in carrying out laborious tasks! But the primary concern is the fear that this will lead to unemployment. The fear is genuine, as we have watched many movies where robots take over the world!
However, in reality, these machines are helping industries work faster. Although they are replacing a few employees, these machines also require someone to operate them! Companies are hiring people who have the expertise to run massive machines. Technological knowledge has thus become essential in today's time. People are afraid that if AI develops further in the future, it will replace more professions. This is possible only if Artificial General Intelligence (AGI) is developed. With AGI, machines would think and work just like humans. However, as of today, that is possible only in science fiction! But if you are a CEO, social worker, doctor, teacher, or psychiatrist, you need not fear AI. The professions which require humane qualities are entirely safe from the bots. Similarly, in the field of journalism, the bots will only assist in producing rote articles. They won't take over other journalistic work that requires high-level understanding and interpretation. For example, investigative and interpretative journalism will always require humans!
How Is AI Helping Journalists and Writers?
A good journalist needs to have an excellent analytic mind, an understanding of current affairs, curiosity, and creativity! AI is not capable of these qualities as of now. Robo journalists are currently writing in the fields of sports, weather, elections, and business. The news from these sectors is data-driven, which is easy for AI to understand. Many large news outlets use automation in these beats to carry out such tasks, leaving the difficult work for the human journalists. If you are a journalist or a writer, you must have already come in contact with AI! Things like Google Translate or any other translation tool are part of Artificial Intelligence. Do you use Grammarly or Hemingway to sharpen your write-up? Then you are already taking the help of AI. Do you use Trint to transcribe your interviews? Well, that also works on machine learning algorithms. AI is helping journalists and writers by making their work easier. Moreover, it is also economical.
According to the Journalism AI report, AI frees journalists to work on journalistic tasks that require in-depth analysis. This is essential when most news organizations are fighting against their economic condition.
Machine Learning Gadgets in the Newsroom
Large news outlets and agencies such as Reuters are experimenting with many new AI technologies. In 2018, Reuters came up with Lynx Insight and News Tracer, artificial intelligence tools that support Reuters in newsgathering. Lynx Insight identifies ongoing trends and key facts in a huge database. News Tracer, on the other hand, uses Twitter to find breaking news. Usually, news outlets hire a journalist for this task; with News Tracer, this work is done accurately by the machine. In the present times, social media has turned into the hot zone for breaking news, so having a bot keep watch helps a lot! It can detect the news easily and inform the newsroom about it.

Bloomberg News has 'Cyborg' to help it with business reports. It can study large databases and find key facts within a few seconds, which are then used to generate reports. Cyborg is one such automated technology that saves time for the company. In 2014, the Associated Press got its hands on software that can generate news stories. It tried the software out on sports news first, and when everything went well, it started using it for business news too.

The Washington Post has an AI known as 'Heliograf'. According to a wired.com article, Heliograf detects the important data in a massive database and then fits it into templates and phrases; it uses various templates and fills in the blanks to make different copies that are shared on various platforms. The software can also alert the user if it finds any issues in the data. (A toy sketch of this template-filling idea follows at the end of this section.) So, we can expect that soon more media organizations will employ AI to produce news stories on a large scale.

The primary task of AI in the news and writing industry is to assist journalists and writers. The bots are programmed to do repetitive work, but recent experiments have found that AI can help with a wide range of tasks such as newsgathering. The creative aspects of journalistic work are for humans only, although AI algorithms can assist with them. So, journalists and writers must learn how to use Artificial Intelligence to make their work easier. Google News Initiative is one such program that educates journalists and writers about machine learning in journalism. You may visit the Google News Initiative training section to take the free online courses. You will learn not only about machine learning in journalism but also about other journalistic Google tools. After completion, you will also receive an e-certificate free of cost.

So, Artificial Intelligence and journalists can work together to generate high-quality information for the public. This will not only help strengthen the economic condition of media organizations but also help build public trust. According to Francesco Marconi, the head of research and development at The Wall Street Journal, "Maybe a few years ago A.I. was this new shiny technology used by high tech companies, but now it's actually becoming a necessity." So, it is probable that many more tools of journalism will require AI in the future.
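Here is that toy sketch of template-based story generation, in the spirit of tools like Heliograf. This is an editorial illustration, not Heliograf's actual code; the templates and game data are invented:

```python
# Toy sketch of template-filled news generation; all names and numbers invented.
game = {"home": "Eagles", "away": "Falcons", "home_score": 24, "away_score": 17}

templates = {
    "big_win": "{winner} cruised past {loser}, {ws}-{ls}.",
    "close_game": "{winner} edged out {loser} in a {ws}-{ls} nail-biter.",
}

def write_recap(g: dict) -> str:
    """Pick a template from the margin of victory and fill in the blanks."""
    home_won = g["home_score"] > g["away_score"]
    winner, loser = (g["home"], g["away"]) if home_won else (g["away"], g["home"])
    ws, ls = sorted((g["home_score"], g["away_score"]), reverse=True)
    key = "close_game" if ws - ls <= 7 else "big_win"
    return templates[key].format(winner=winner, loser=loser, ws=ws, ls=ls)

print(write_recap(game))  # -> "Eagles edged out Falcons in a 24-17 nail-biter."
```

A production system layers data validation, far more templates, and editorial review on top of this basic fill-in-the-blanks idea.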
Suman Bambhniya
What Does AI Have In The Box For E-commerce?
Introduction
Whether you are an entrepreneur aspiring to start an e-commerce business or an employee presently working in the e-commerce sector, AI packs countless surprises in the box for you. It is AI that defines your business's existence in a fast-paced and competitive marketplace; without it, your business will be like a needle in a haystack. Digital marketing by itself isn't enough. Of course, it's essential for brand awareness, but retaining the brand name means retaining customers by streamlining the buyer's journey. To survive in a saturated market and establish a fruitful customer relationship, AI brings automation to your e-commerce business. Automation provides a unique feature of multi-tasking by cutting down major time-consuming activities, freeing up time for better customer service. You will find e-commerce users in every corner of the world, but if you plan to spend your career in the field, here's what you must know.
What exactly is E-Commerce Automation? E-Commerce automation is just one of the areas where automation has spread its roots. Automation technology is a versatile and convenient technology that uses collective tools to identify and automate repetitive and mundane tasks with dynamic strategies. It plays a vital role from the start to the end of a customer’s buying journey. From a cold lead to making them loyal customers, automation does it all.
What's all in the Box?
1. Dynamic E-mail Content Strategy
Digital marketing campaigns will surely bring a significant number of visitors to your website every day, but converting a visitor into a hot lead can be a mundane task. After a visitor lands on the website, a dynamic form-filling strategy is enabled, prompting them to fill in their details along with an e-mail id. Imagine you are in a store looking to buy a shirt, but the salesperson shows you the best denim jeans collection; you may like it, but that's not on your tick-list. The salesperson continues showing you items other
than shirts; won't that trigger you to leave the store? Similarly, bombarding customers with irrelevant e-mails and notifications can trigger a customer to unsubscribe from your e-mail list. Hence, providing relevant content is crucial. To accomplish this, you can sync your CRM with the e-mail list. Here, automation studies the customer's behavior, preferences, interests, age, location, gender, etc., and prepares a demographic and behavioral report. A personalized e-mail is then sent to the customer based on that report.
For Instance: If a customer has wish-listed an item, selected an out-of-stock product, or added a product to the cart but did not proceed with payment yet, an automated personalized e-mail is sent to the customer reminding them about the product status in their cart.
The ‘A/B’ testing feature allows marketers to use two different strategies to trigger a lead to take a Call-To-Action (CTA).
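As a minimal sketch of the abandoned-cart trigger described above (the customer records and the send_email helper are hypothetical stand-ins, not a real mailer API):

```python
# Minimal sketch of an abandoned-cart reminder rule; all data is invented.
from datetime import datetime, timedelta

customers = [
    {"email": "a@example.com", "cart": ["shirt"], "paid": False,
     "last_activity": datetime(2021, 4, 1, 10, 0)},
    {"email": "b@example.com", "cart": ["jeans"], "paid": True,
     "last_activity": datetime(2021, 4, 1, 11, 0)},
]

def send_email(address: str, subject: str) -> None:
    print(f"to={address} subject={subject!r}")  # stand-in for a real mailer

# Trigger a reminder when a cart has items, payment never happened,
# and the customer has been inactive for more than a day.
now = datetime(2021, 4, 3, 9, 0)
for c in customers:
    if c["cart"] and not c["paid"] and now - c["last_activity"] > timedelta(days=1):
        send_email(c["email"], "You left something in your cart")
```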
2. Customer Retention
Artificial Intelligence in e-commerce has a lot more to offer than just selling. Selling alone won't benefit your business; loyal customers are what drive the Return on Investment (ROI). By integrating automation, you can establish a long-term, fruitful relationship with a one-time customer, thereby converting them into a loyal one. So, how can this be achieved?
· Up-selling: This encourages your one-time customer to shop for a better version of the product or service. It is comparatively more expensive than the primary purchase.
· Cross-selling: This encourages an existing customer to purchase an additional or complementary product.
· Cycle-based Selling: This type of selling can trigger an existing customer to purchase seasonal products.
3. Chatbot
"All customer service representatives are busy right now. Please drop a message, and we'll call back shortly." Ever heard this? This is where the chatbot comes into the picture. It offers 24/7 availability, instant responses, and detailed answers to your queries. Chatbots thus assist in better customer service and increased interaction with visitors.
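A rule-based chatbot can be sketched in a few lines; the canned answers below are invented, and production bots typically use NLU platforms rather than simple keyword matching:

```python
# Toy sketch of a rule-based support chatbot; the FAQ entries are invented.
FAQ = {
    "refund": "Refunds are processed within 5-7 business days.",
    "shipping": "Standard shipping takes 3-5 business days.",
}

def reply(message: str) -> str:
    """Return a canned answer for the first FAQ keyword found in the message."""
    text = message.lower()
    for keyword, answer in FAQ.items():
        if keyword in text:
            return answer
    return "All our agents are busy; we'll get back to you shortly."

print(reply("Where is my shipping confirmation?"))
```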
4. Automated Inventory Management
Automation in inventory allows real-time tracking and management of stock. It prevents under-stocking and over-stocking, thus reducing non-fulfillment and warehouse space wastage. An automated notification about stock status is sent to the supplier when inventory falls to a predetermined level or is not as per the record (predictive analysis). For example, it predicts the date by which 90% of the stock will be consumed and signals for ordering new stock.
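A minimal sketch of the reorder rule described above, with hypothetical product data and a stand-in notify_supplier helper:

```python
# Toy sketch of a reorder-point check; all quantities and names are invented.
def notify_supplier(sku: str, quantity: int) -> None:
    print(f"reorder {quantity} units of {sku}")  # stand-in for a real integration

stock_on_hand = 90
initial_stock = 1000
daily_consumption = 30          # units/day, estimated from sales history
REORDER_FRACTION = 0.10         # reorder once 90% of stock is consumed

if stock_on_hand <= initial_stock * REORDER_FRACTION:
    # Order enough to cover a week of demand as a simple buffer.
    notify_supplier("SKU-123", quantity=daily_consumption * 7)
```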
5. Customer Order Management
One of the main reasons for losing a potential customer is a delay in order delivery. From order placement to the feedback stage, automation helps in serving the customer better. Accurate shipping labels avoid human error at the time of dispatch.
6. Analytics
Tracking the buyer journey becomes simple with the analytical tools AI offers. By integrating analytics, marketers can target and re-target the audience. It provides real-time customer data, tracks the sales and marketing funnel, and analyzes ROI. It helps to measure:
· Marketing campaign effectiveness.
· Customer engagement.
· Call-To-Action.
· Website optimization required.
· User-friendly navigation.
· Demographics.
· Inventory.
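To make the funnel measurement concrete, here is a tiny calculation; the event counts are invented:

```python
# Toy funnel measurement: conversion rate between consecutive stages.
funnel = {"visits": 10_000, "signups": 800, "purchases": 120}

steps = list(funnel)
for prev, curr in zip(steps, steps[1:]):
    rate = funnel[curr] / funnel[prev]
    print(f"{prev} -> {curr}: {rate:.1%}")
# visits -> signups: 8.0%, signups -> purchases: 15.0%
```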
Conclusion
AI uses automation tools to ensure operations flow smoothly through the system. Every step in the buyer journey is automated, resulting in an enhanced customer experience. Businesses are designing their online services to imitate the in-store experience. Additionally, AI reduces susceptibility to fraud by learning the patterns of fraudulent attempts. The 'ifs' and 'buts' were the traditional approach to making business decisions; today, AI's predictions are far more reliable and have a great probability of success. E-commerce is just one sector where AI has spread its roots. The future will bring more advanced tools to integrate, enhancing the experience at both the individual and business levels.
Discover more of our Magazine today...