Quriosity Volume 10 Issue 01

Page 1

VOLUME 10 ISSUE 1

JANUARY 2019

QURIOSITY MONTHLY NEWSLETTER OF QUANTINUUM – THE QUANT & ANALYTICS COMMITTEE @SIMSR

IN-MEMORY ANALYTICS An emerging business intelligence methodology that is set to change analytics in the near future | p. 03

QUANT GURU D.R. Kaprekar was an Indian recreational mathematician known for describing several classes of natural numbers. | p. 13

QUANT FUN Surprising puzzles and questions to rack your brain | p. 23


EDITOR’S NOTE

Welcome to the latest issue of Quriosity, the monthly newsletter of Quantinuum! Quantinuum, the Quant and Analytics committee of K.J. Somaiya Institute of Management Studies and Research, aims to empower students and professionals alike to organize and analyze numbers and, in turn, to make sound, rational decisions as future managers. The newsletter, published monthly, consists of articles that enrich young minds by informing them about contributions made in the fields of quant, analytics, and mathematics. The objective of Quriosity is to publish up-to-date articles on data analytics alongside relevant and insightful news. In this way the magazine aspires to be vibrant, engaging and accessible, and at the same time integrative.

In this issue we cover the guest lecture by Ms. Reshma Shah on Analytics in Retail and E-commerce, organized by Quantinuum, in which she shared her knowledge and industry insights and offered guidance on careers and education. We have articles on how big data is being used in the oil and gas industry and how the experience economy is flourishing. The main article is about in-memory analytics. In Quant Fun, we have notched up the difficulty of the Sudoku puzzles with this edition. In Quant Guru, the focus is on the contributions of Dattatreya Ramachandra Kaprekar, also known as "Ganitanand", an Indian recreational mathematician known for describing several classes of natural numbers.

We sincerely hope this issue helps our readers gain insights into the latest developments in the field of analytics and data science. If you wish to submit articles or news items, either individually or collaboratively, you are welcome to write to us at quriosity.quantinuum@gmail.com.

Thank you and Happy Reading!
Quriosity Editorial Team Quantinuum@SIMSR Editorial Team: VVNS Anudeep (+91 9441201685) Khushbu Mehta (+91 9930158610) Tanmay Nikam (+91 9699288587) Kaustubh Karanje (+91 7738219050) Dhyan Baby K (+91 9809245308) Shubham Thakur Saumya Joshi

1


IN THIS ISSUE

Main story: In-memory Analytics by Joseph Toms … pg. 03
Sub article: Experience Economy by Santanu Mitra … pg. 06
Sub article: Big Data in the Oil and Gas Industry by Ekta Arora … pg. 09
Quant Guru: D.R. Kaprekar (1905-1986) by Dhyan Baby K … pg. 13
Curiosity Update by Saumya Joshi … pg. 15
News Digest by Saumya Joshi … pg. 16
Event Report by Manish Mishra … pg. 18
Book Review: Digital Marketing Analytics (Chuck Hemann & Ken Burbary) by Akshay Nandan R … pg. 20
Quant Tutorial: Tutorial on Tableau by Akshay Nandan R … pg. 22
Quant Fun by Tanmay Nikam … pg. 23
Quant Connect … pg. 28

2


QUANT COVER STORY

In-Memory Analytics by Joseph Toms PGFS 2018-20

In-memory analytics is an emerging business intelligence methodology that will certainly change analytics in the near future. Something that is "in-memory" is data that resides in RAM (Random Access Memory), as opposed to data stored on disk. When the CPU processes data, that data must be in memory; if it is not, it first has to be fetched into memory, which slows the process down. In-memory computing is therefore much faster, especially when dealing with large chunks of data.

An analogy: our brains store millions of pieces of information every day, organized as nodes that represent concepts, with the nodes linked by relatedness. Disk storage, by contrast, is like keeping files in tagged folders: retrieving information takes time, because we must go to the particular folder, get the required file and read it. In-memory analytics removes that trip by enlarging working memory so that everything is instantly available, in real time, reducing delay and removing latency. Whereas traditional databases kept about 90% of their data in disk-based storage and only 10% in memory, an in-memory database keeps about 90% of its data in memory, cutting access time dramatically, especially when applications need to reach data to add features such as real-time information.
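As a concrete illustration of "in-memory" versus disk-resident data, the sketch below uses Python's built-in sqlite3 module, whose special ":memory:" connection string keeps the entire database in RAM. The table and figures are invented for illustration:

```python
import sqlite3

# ":memory:" keeps every page of the database in RAM, so queries
# never touch the disk -- the essence of in-memory analytics.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120.0), ("South", 80.0), ("North", 50.0)],
)

# Aggregations run against RAM-resident pages, not disk blocks.
total_by_region = dict(
    conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
)
print(total_by_region)  # {'North': 170.0, 'South': 80.0}
```

The same SQL would work against a file-backed database; only the connection string changes, which is what makes the in-memory/on-disk trade-off so easy to demonstrate.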


With the recent fall in RAM prices and the arrival of 64-bit operating systems, manufacturers can now build machines with a few terabytes of main memory, enough to hold an entire data warehouse. This means the traditional methods of storing data in tables on disk and indexing or optimizing queries will no longer be required. Even sophisticated models can be executed in near real time, and business intelligence applications will become dramatically more flexible, as pre-aggregated "cubes" will no longer be necessary. In-memory computing is also much better than caching, a widely used technique for speeding up query performance.
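The difference between caching and full in-memory residence can be sketched with Python's functools.lru_cache. The query function and dataset here are hypothetical stand-ins for a disk-backed store:

```python
from functools import lru_cache

# Hypothetical stand-in for a disk-backed store: each lookup of a key
# would normally pay the cost of a disk read.
DISK = {"q1_sales": 1200, "q2_sales": 950}

@lru_cache(maxsize=128)
def cached_query(key: str) -> int:
    # The first call for a given key pays the "disk" cost; repeat calls
    # are served from the cache. A key never seen before always pays
    # full price -- unlike an in-memory store, where all of the data is
    # already resident in RAM.
    return DISK[key]

print(cached_query("q1_sales"))        # first call: goes to "disk"
print(cached_query("q1_sales"))        # repeat call: served from cache
print(cached_query.cache_info().hits)  # 1
```

A cache only helps with queries that have already been asked; an in-memory database answers a brand-new query just as fast, which is why the article ranks it above caching.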

Business intelligence has two main parts: reporting and analytics. Analytics is the harder part; it involves analysing huge datasets, performing computations and comparing data in order to understand shopping patterns, usage patterns and so on. The speed of this analysis depends on how quickly we can reach these datasets and how quickly they can be evaluated. In the traditional data warehouse approach, data is stored in the warehouse and parts of it are extracted into data marts for pre-analysis and pre-aggregation; business intelligence applications then use the results stored in the data marts for reporting. In in-memory analytics, by contrast, applications do not access partial data; results are computed in real time by querying a built-in calculation engine. With the in-memory approach, data is updated frequently and the queried results can be properly visualized. This is where in-memory computing revolutionizes analytics: the entire dataset is available in memory, with no time-consuming disk fetch.

Consider a scenario where an entire report is presented to the user, and all of the underlying data needs to be available so the user can drill through it using filters. A conventional business intelligence tool would take a huge amount of time to query all the data from the database, so it normally doesn't; instead, it pulls data at a high level and fetches detail only when the user drills down. Every such fetch involves a query and consumes time. In-memory business intelligence, however, makes the entire dataset available from the start. Initial loading and every subsequent drill-down happen at the speed of thought, because no disk queries are required. This kind of quick, interactive analysis is essential when the user is trying to uncover unknown patterns or discover new opportunities. With conventional reporting tools, users must stick to specific questions, such as "What are the sales for this quarter?" With in-memory visualization tools, the question can be less specific and more exploratory: "Show the data and the patterns in this quarter's sales."

The ability to perform what-if analysis on the fly is another feature of in-memory tools. For example, users may want to know how this quarter's profits would change if the prices of several items were increased or transit times reduced. A conventional OLAP (online analytical processing) tool would require the entire database to be recalculated overnight, whereas with in-memory technology the results are immediately available and held in memory. Many popular analytics tools, such as QlikView, Tableau, TIBCO Spotfire, SAP HANA and Oracle Exalytics, make use of in-memory processing to varying degrees.
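A what-if scenario of the kind described above can be mimicked on a small memory-resident dataset in plain Python. The order rows and markup figure below are invented for illustration:

```python
# Hypothetical in-memory dataset: two order lines held in RAM.
orders = [
    {"item": "widget", "units": 100, "price": 5.0, "unit_cost": 3.0},
    {"item": "gadget", "units": 40, "price": 12.0, "unit_cost": 7.0},
]

def profit(price_markup_pct: float = 0.0) -> float:
    """Total profit if every price were raised by price_markup_pct percent.

    Because the rows are memory-resident, each what-if scenario is a
    fresh pass over the data, not an overnight cube recalculation.
    """
    return sum(
        row["units"] * (row["price"] * (1 + price_markup_pct / 100) - row["unit_cost"])
        for row in orders
    )

baseline = profit()                     # profit at current prices: 400.0
scenario = profit(price_markup_pct=10)  # profit if all prices rose 10%
```

Recomputing `scenario` for any markup is instantaneous here; the OLAP contrast in the text is about doing the same at warehouse scale.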
Though the hardware required for in-memory technology is expensive and the business intelligence applications that can exploit this speed are still limited in number, this method of computing should definitely be watched as an industry trend. It will not just give companies speed but will also yield deeper insights at more granular levels, which are needed to increase revenue. Data scientists will no longer be restricted to a sample; they will have the liberty to apply as many analytical techniques and iterations as needed to find the best model, exploiting the potential of the vast memory available in present-day computers.

References:
https://www.webopedia.com/TERM/I/in_memory_analytics.html
https://www.softwareag.com/resources/In-memory-Analytics
https://www2.deloitte.com/content/dam/Deloitte/de/Documents/technology-mediatelecommunications/TMT_Studie_In_Memory_Computing.pdf
https://aminemekkaoui.typepad.com/business_intelligence/inmemory_analytics/

5


SUB ARTICLE

Experience Economy and Analytics by Santanu Mitra PGDM Core 2018-20

With the passage of time there has been a fundamental change in the very fabric of the economy: we have gone from an agrarian economy of commodities to an industrial economy based on goods, and further to a service economy. Interestingly, in the service economy goods became commoditized, meaning they were treated as commodities. As a result, people did not care much about the brand; the only concern was price. In the present scenario, the Internet is the greatest force of commoditization, as customers can compare prices with ease, which eventually pushes them toward the lowest price. Services are becoming commoditized too: in financial services, for example, buying or selling a block of shares that used to cost a hundred dollars with a full-service broker now costs only three to four dollars with an Internet-based broker. So, as goods and services become mere commodities, it is important to move beyond them to a new source of economic value: staging experiences for customers. Experiences are distinct economic offerings; the idea is to use goods as props and services as the stage to create a memory, which is the hallmark of an experience. Starbucks founder Howard Schultz had a great coffee experience in Milan, Italy, brought it to the USA, and the model is now spreading across the globe. Similarly, Nespresso emphasizes that if customers can feel the coffee-making experience, the product becomes much easier to sell; hence it has created the Nespresso boutiques, where customers come in and are exposed to various products, and the staff can make a coffee for them on a Nespresso machine so that they experience the product first hand.

6


Now, if we analyse it, the coffee beans for a cup of coffee cost only two to three cents; roasted and packed, they fetch five to fifteen cents. If the coffee is brewed for a customer in a café, it costs a dollar or a dollar and a half. But when the coffee is surrounded with the ambience of Starbucks, we pay as much as six dollars for a cup whose beans are worth two to three cents!

Experiences, as mentioned, are distinct and can be staged in various settings. The US Army does it in recruiting: it designed a game that gives the experience of a real-time mission, at one tenth the cost of traditional advertising. Whirlpool, on the other hand, has a program called "The Real Whirled" in which Whirlpool employees stay in a house they are not allowed to leave, carrying out their day-to-day activities using Whirlpool appliances. The idea is to let salespeople experience the product first hand before they actually sell it to customers. Most sales still take place in person, but with the Internet coming into the picture, thousands of new customer experiences are being created that have a strong connection to sales. According to research, around 84 percent of customers use digital content at some point while buying an automobile, and as a result there has been a significant drop in customers visiting showrooms.

Success in this era is all about evaluating current data points. This means keeping track of the methods by which customers interact with digital technologies, which are changing drastically. Mobiles are slowly replacing personal computers, while the Internet of Things and autonomous chat devices are entering the picture; keyboards gave way to touchscreens, and now speech and gesture recognition are the technologies in use. The crux of the matter is that with every change, customers create new demands, both in experiences and in interactions. As per recent research in the UK, approximately 30 percent of consumers said they would actually stop visiting websites that lack personalization, and as many as 86 percent said that personalization plays a crucial role in their purchasing decisions. Various personalization channels offer the potential to increase revenues by 10-25 percent. In particular, personalization can be a strong lever for companies seeking to regain competitiveness versus market leaders. For example, a frequent business traveller searching for a trip to London might have his search results prioritized for an airline on which he has status, with his preferred window seat automatically suggested, a white car pick-up at the airport offered, and a river-view hotel with business amenities opposite his office proposed.

In a scenario where each client contact point is an opportunity and a threat at the same time, the ability to analyse, interpret and value data has become the mantra for success. The irony is that this new oil of the economy remains mostly untapped: it is estimated that of all the data generated in recent years around the world, only about one percent has been analysed and valued. Collecting data is not the tough part in this era. The challenge is identifying the 20 percent of data that will generate the biggest impact, integrating the new data from multiple sources into existing business models, and converting it into value. It is estimated that about two-thirds of the value of data today can be used to enhance customer experiences and increase personalization. In this data-driven world, platforms like Facebook, Instagram and YouTube are stages behaving as online experience hubs. Nobody had heard of Blendtec until CEO Tom Dickson started putting "Will It Blend?" videos on YouTube; since then, sales have increased by 700%. There is no room to stand still in today's rapidly shifting, unpredictable technology and consumer landscape. Focusing on experiences, personalization, and smart data can help all companies keep on dancing even as the music changes.

References:
https://www.forbes.com/sites/danielnewman/2015/11/24/what-is-the-experience-economy-shouldyour-business-care/#42aecdf01d0c
https://www.localist.com/blog/experience-economy/

8


SUB ARTICLE

Big Data in Oil and Gas industry by Ekta Arora MMS 2018-20

For a society, one of the primary assets for success is a low-cost energy source, and low cost is driven by supply exceeding demand. The oil and gas industry is viable only when it finds supplies in sufficient quantities. Finding and producing hydrocarbons is economically risky and technically challenging, and it generates a large amount of data. New technologies and approaches have to be designed to integrate and analyse this data so that faster and more accurate decisions can be taken. Done correctly, all of this leads to safely finding new resources, increasing recovery rates and reducing environmental impacts. Big data is a term used to describe the large volumes of both structured and unstructured data that inundate a business on a day-to-day basis. This data can be analysed for insights that support better decisions and strategic business moves. Three V's describe big data: volume, variety and velocity.

9


Big data analytics may be new to some industries, but the oil and gas industry has always had to deal with large amounts of data in its search to learn what lies beneath the surface. Data visualization, seismic software and now a new generation of pervasive computing devices, sensors that gather and transmit data, have further helped the industry deal strategically with its data and open new possibilities. Advanced analytic capabilities and new tools have helped oil and gas producers capture more real-time detail at relatively lower cost, which could help them boost oilfield and plant performance by six to eight percent. Digital data about oil and gas fields, drilling, exploration and production processes, oil products and their marketing, and the market status of oil and gas companies flows into the companies' management centres. This data is then used, with sophisticated modelling techniques, for the exploration and development of oil and gas beds, investment decisions, and the development of oil and gas production. Big data provides technologies that reduce the time needed to process such large-scale data, speed up geological and hydrodynamic modelling, and enable more complex modelling. The leading global players in the big data market in the oil and gas sector are HP, Cloudera, Hitachi Data Systems, MapR Technologies, IBM, Oracle, EMC, and SAP. Their work includes the development of high-performance computing systems that collect geological, mining and other data from sensors across the entire technological chain of oil and gas production and processing, and then analyse and manage these large volumes of data. Many different types of sensors are widely used: 4-component sensors in seismic exploration, 4-dimensional seismic sensors in geophysical surveys, and optical-fibre sensors in extraction, wells, processing and transportation systems.
These sensors produce large volumes of data; for example, the information obtained during the seismic exploration of a field can reach tens of terabytes. An optical-fibre sensor in a well can give real-time control of the pipe's performance and condition based on parameters such as temperature and pressure, measured at very small intervals, and the data received daily from such sensors in a single well can reach several terabytes. Geological modelling of oil and gas fields produces a 3D model of a layer, which helps to estimate the hydrocarbon reserves in it. The 3D hydrodynamic model, developed from the geological model, shows the changes in the volume and properties of the reserves in the field's layer. Based on the information received from the sensors installed in the well, the hydrodynamic and geological models are adapted, and various reports, tables and texts are generated. As a result of the geological and hydrodynamic modelling of oil and gas fields, terabytes, even petabytes, of seismic, geophysical and mining data are processed. A large amount of unstructured and semi-structured data is also collected from outside sources such as social networks, media, reports and websites, as oil and gas companies are in the focus of commercial structures, authorities and the media. Big data technologies have the following potential:
● Big data technologies significantly reduce the cost of storing data, freeing resources for analysing it.


● Unstructured data integration makes it possible to work with Hadoop data using structured data-processing tools (SQL etc.).
● Software built on Apache Hadoop technology provides high-performance, scalable data processing through horizontal scaling.
● Big data technologies make it easy to comprehend all the data needed for operational management decisions.
● Big data and cloud infrastructure are developing in parallel. Cloud technology provides the flexibility big data needs for large-scale processing and reduces cost; software and hardware stacks such as IaaS (Infrastructure as a Service) or PaaS (Platform as a Service) can be rented in the cloud for big data workloads.
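As a rough sketch of the kind of reduction applied to the sensor streams described above (the well IDs and readings are invented), per-well statistics can be rolled up before the data ever leaves the field:

```python
from collections import defaultdict

# Hypothetical fibre-optic readings: (well_id, metric, value).
readings = [
    ("W1", "temp", 61.2), ("W1", "pressure", 310.0),
    ("W2", "temp", 58.9), ("W1", "temp", 62.0),
]

# Group the stream by (well, metric) so it can be summarized.
groups = defaultdict(list)
for well, metric, value in readings:
    groups[(well, metric)].append(value)

# Per-well, per-metric min/max/mean -- a tiny stand-in for the
# terabyte-scale rollups the article describes.
summary = {
    key: {"min": min(vals), "max": max(vals), "mean": sum(vals) / len(vals)}
    for key, vals in groups.items()
}
```

At real scale this reduction would run on a horizontally scaled Hadoop-style cluster, as the bullet points note, but the shape of the computation is the same.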

Shell is one of the companies using big data technology. It uses fibre-optic cables, created in partnership with Hewlett-Packard, and the data generated is transferred to private servers running on Amazon Web Services. The data is accurate and can be compared with fields around the world, resulting in better recommendations about where to drill and in forecasts of a reservoir's output. Sensors on the physical machines record their performance and compare it with historical data, ensuring that the machines are working properly and reducing time lost to breakdowns and failures. Shell is a vertically integrated company, so it also uses big data to streamline the transport, refinement and distribution of oil and gas: complex algorithms take into account the cost of producing the fuel and other factors, such as economic indicators, to set prices at the pump and allocate resources. Big data strategy is still at an experimental level and only a few companies have been trying it, but a large amount of data is being intensively used and analysed by the oil and gas industry, and big data technology is a great help in producing efficient results. Companies with better big data capabilities are twice as likely to be among the top financial performers, and five times more likely to make faster decisions than their peers. Real-time big data analytics may reduce costs and risks, ensure more efficient oil production, improve security and compliance with regulatory requirements, and improve the quality of decisions.
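Shell's actual pricing algorithms are proprietary, but the idea of setting a pump price from production cost plus demand-side adjustments can be sketched with a toy function. All numbers and parameter names here are invented for illustration:

```python
def pump_price(production_cost: float, demand_index: float,
               margin: float = 0.12) -> float:
    """Toy pump price per litre: production cost, a fixed margin,
    and a multiplicative demand adjustment (an invented stand-in for
    the 'economic indicators' the article mentions)."""
    base = production_cost * (1 + margin)
    return round(base * demand_index, 2)

# Cost of 0.80 per litre, demand 5% above normal:
print(pump_price(0.80, demand_index=1.05))  # 0.94
```

A real system would fold in many more factors (taxes, logistics, competitor prices); the point is only that pricing becomes a function over data that can be recomputed as the inputs stream in.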


References:
http://analytics-magazine.org/how-big-data-is-changing-the-oil-a-gas-industry/
http://customerthink.com/what-is-the-importance-of-big-data-in-the-oil-and-gas-industry/
https://www.offshore-technology.com/features/big-data-in-oil-and-gas-tech/
https://www.bain.com/insights/new-possibilities-in-big-data-analytics-in-oil-and-gasgulfbase/

12


QUANTGURU

D.R. Kaprekar (1905-1986) by Dhyan Baby K PG FS 2018-20

Dattatreya Ramachandra Kaprekar, also known as "Ganitanand", was an Indian recreational mathematician known for describing several classes of natural numbers, including the Kaprekar, Harshad and self numbers. He is best known for discovering the Kaprekar constant, named after him. A school teacher by occupation, he is well known in recreational mathematics circles for his contributions.

D. R. Kaprekar was born on 17 January 1905 in Dahanu, a town on the west coast of India about 100 km north of Mumbai. His mother died when he was just eight years old, and he was brought up by his father, a clerk fascinated by astrology. This had a deep impact on him and was one of the reasons for Kaprekar's love of calculation and mathematics. Kaprekar completed his secondary education in Thane and went on to Fergusson College in Pune in 1923. He attended the University of Mumbai for his bachelor's degree but never received any formal postgraduate training. Still, he published extensively, writing on topics such as recurring decimals, magic squares, and integers with special properties. Kaprekar won numerous awards, including the Wrangler R. P. Paranjpe Mathematical Prize in 1927.

He is best known for discovering Kaprekar's constant, 6174. To see its magic, choose any 4-digit number whose digits are not all equal. Rearrange the digits to form the largest and smallest numbers and subtract the smaller from the larger. Repeat the process with the result, and in at most seven steps you will reach Kaprekar's constant. Such is the magic of Kaprekar and his numbers. Example:
7433 - 3347 = 4086
8640 - 0468 = 8172
8721 - 1278 = 7443
7443 - 3447 = 3996
9963 - 3699 = 6264
6642 - 2466 = 4176
7641 - 1467 = 6174
There are many more interesting applications of his work; please refer to the links below for more information.
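The subtraction routine described above is easy to verify in a few lines of Python; starting from 7433, as in the article's example, it reaches 6174 in exactly seven steps:

```python
def kaprekar_steps(n: int) -> int:
    """Count the subtractions needed for a 4-digit number (digits not
    all equal) to reach Kaprekar's constant, 6174."""
    steps = 0
    while n != 6174:
        digits = f"{n:04d}"  # pad to 4 digits to keep leading zeros
        big = int("".join(sorted(digits, reverse=True)))
        small = int("".join(sorted(digits)))
        n = big - small
        steps += 1
    return steps

print(kaprekar_steps(7433))  # 7 -- the seven subtractions shown in the example
```

Note the precondition: if all four digits are equal (e.g. 1111), the subtraction yields 0 and the process never reaches 6174, which is why such numbers are excluded.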
His contributions have garnered international recognition and have been published frequently. Kaprekar died in 1986 in Devlali, Maharashtra.

Reference: http://www-history.mcs.st-andrews.ac.uk/Biographies/Kaprekar.html


https://en.wikipedia.org/wiki/D._R._Kaprekar
http://www.wikiwand.com/en/D._R._Kaprekar

14


CURIOSITY UPDATE by Saumya Joshi PGDM 2018-20

Rising global stature of ISRO: 45 countries to be trained in making nano-satellites
Around 45 countries will be trained by the Indian Space Research Organisation (ISRO) to build nano-satellites as part of India's space diplomacy. Argentina, Brazil, Chile, Mexico and Panama, along with Egypt, Indonesia, Kazakhstan, Malaysia, Mongolia, Morocco, Myanmar, Oman, and Portugal, are part of the first batch being trained by ISRO. According to ISRO, the programme is the agency's initiative to commemorate the 50th anniversary of the first United Nations Conference on the Exploration and Peaceful Uses of Outer Space (UNISPACE+50), held in 1968.
Reference: https://www.financialexpress.com/lifestyle/science/rising-global-stature-of-isro-45-countriesto-be-trained-in-making-nano-satellites/1450693/

NASA and China collaborate on Moon exploration
The space agencies of the United States and China are coordinating efforts on Moon exploration, NASA said, as it navigates a strict legal framework aimed at protecting national security and preventing technology transfer to China. NASA's lunar orbiter will pass over the Chang'e 4 landing site on January 31 and will snap pictures, as it did for Chang'e 3 in 2013.
Reference: https://www.thehindu.com/sci-tech/nasa-and-china-collaborate-on-moonexploration/article26041965.ece

NASA may decide this year to land a drone on Saturn's moon Titan
A team of scientists is working on a proposed mission called "Dragonfly", which aims to combine terrestrial drone technology and instruments honed by Mars exploration to investigate the complex chemical reactions taking place on Saturn's largest moon. If the mission is undertaken, Dragonfly would be launched in 2025 and arrive at Titan in 2034.
Reference: https://www.space.com/43010-dragonfly-mission-would-put-a-drone-on-titan.html

15


NEWS DIGEST by Saumya Joshi PGDM 2018-20

How Aadhaar's Database can be used to train a surveillance AI for the Indian Government
Aadhaar contains the world's biggest collection of personally identifiable and biometrically verifiable information on Indian citizens. It would not be a stretch to suggest that the government might implement an AI to learn from patterns in Aadhaar data. NITI Aayog has been one of the biggest parties pushing for higher penetration of AI adoption in India.
Reference: https://www.analyticsindiamag.com/how-aadhaars-database-can-be-used-to-train-asurveillance-ai-for-the-indian-government/

How Data Analytics backed by AI and ML is transforming the BFSI sector
The financial industry is highly regulated and data-intensive. With the advent of new entrants such as fintechs, digital and payment banks, new regulations, and changes in customer behaviour, the traditional banking system is facing external disruption and pressure to reinvent itself and critically examine its business processes, not only to win more clients but also to enhance the existing customer experience.
Reference: https://www.analyticsindiamag.com/data-analytics-ai-ml-bfsi/

How Predictive Analytics is transforming the Ad-Tech Industry It's not possible to think about today's advertising industry without appreciating the pervasive use of ad tech, the tools that enable brands to target, deliver, and analyze digital advertising. For roughly the last six years, the two primary ad tech tools, programmatic ad campaigns and exchanges, have increasingly been the sails propelling a $628 billion global digital advertising industry. Reference: https://tdwi.org/articles/2019/01/18/adv-all-how-predictive-analytics-transforming-ad-techindustry.aspx

16


Artificial Intelligence is a Great Detector Tool
In an increasingly digitized world, cyber-attacks are growing in volume. With more organizations using the web to their advantage, cybercriminals are looking for ways to penetrate security defences. AI provides instant insights so that organizations can connect the dots between threats far more effectively, reducing response times and making a business's security more compliant.
Reference: https://www.analyticsinsight.net/artificial-intelligence-is-a-great-detector-tool/

Face Detection with Python using OpenCV
Face detection is a computer vision technology that helps to locate and visualize human faces in digital images. It is a specific use case of object detection, which deals with detecting instances of semantic objects of a certain class (such as humans, buildings or cars) in digital images and videos. With the advance of technology, face detection has gained a lot of importance, especially in fields like photography, security, and marketing.
Reference: https://www.datacamp.com/community/tutorials/face-detection-python-opencv

Microsoft Launches Asia's Largest AI and IoT Lab in Shanghai
Microsoft will soon open its largest artificial intelligence (AI) and Internet of Things (IoT) lab in Shanghai, in a bid to target China's growing business sectors, ranging from manufacturing to healthcare. It will be the company's third such lab, dedicated exclusively to research and development in AI and IoT.

Reference: https://www.analyticsindiamag.com/microsoft-launches-asias-largest-ai-and-iot-lab-inshanghai/

17


EVENT REPORT

Analytics in Retail and E-commerce by Manish Kumar PGDM RM 2018-20

Guest lecture on Analytics in Retail and E-commerce

"For every two degrees the temperature goes up, check-ins at ice cream shops go up by 2%." - Andrew Hogue

Today's retail environment is the most competitive it has ever been. Online and mobile channels have created additional pressure on retailers to offer lower prices. Staying relevant in this environment and offering a seamless omni-channel experience is more difficult than ever before. Now, more than ever, retailers need to find ways to unlock omni-channel, multi-channel and O2O opportunities, improve store efficiency, drive sales, optimize inventory levels across stores and products, enhance the effectiveness of marketing campaigns and deliver a better shopping experience. To cater to this objective, Quantinuum, the Quant and Analytics committee of K J SIMSR, organized a guest lecture on 21st January 2019.

18


The guest speaker for this event was Ms. Reshma Shah, who has about 10 years of experience in analytics in the e-commerce and retail industry. An alumna of K J SIMSR (2004-2006), she is currently working with Chewy.com as their lead data analyst. The lecture was filled with informative and inspirational discussion about data analytics and how it can help you mould your career. She moved from the basics of analytics, i.e. descriptive and prescriptive analytics, to how forecasting and modelling help in decision making, and the different ways in which data science is used to connect information together. She remarked, "If we know what our customers are buying, then we can surely know them better than them." She also mentioned in passing how retail analytics is being used in retail giants like Future Group, Reliance Retail, Tata Group and Aditya Birla Group, in India and abroad, in their marketing, merchandising, pricing and supply chain. Marketing to new as well as existing customers can be done more effectively, taking the individual's perspective into account, by creating preference designs and models. Merchandising, on the other hand, can be evaluated by analysing customer feedback through sensitivity analysis using languages like SAS, R, Python and SPSS. While discussing pricing, she explained how prices are determined for various merchandise in the market, along with concepts like price optimization, price markup and markdown, price values and price elasticity, giving examples and citing occasional discounts and sales. Core items, tail items, flagship items, cash machines and so on were explained in terms of the sales and gross margin they contribute in a particular retail store, and how these affect retail decisions. Talking about supply chain, she explained the importance of achieving a perfect match in an omni-channel offering.
She also touched on optimization of transportation cost and understanding competitive pricing with product margin and direct implementation. Various tools that help in e-commerce analytics, like hero banners, Adobe Omniture Analytics, Google Analytics, A/B and multivariate testing, and web properties testing, were briefly covered. She further cleared doubts about website development, JavaScript code, retargeting concepts, cookies, email acquisition, wireless tech finders, selection v/s suggestion, etc. She also gave guidance on how jobs in analytics and architecture can help one climb the data science ladder.

19


BOOK REVIEW

Digital Marketing Analytics by Akshay Nandan R PGDM 2018-20

Title: Digital Marketing Analytics
Authors: Chuck Hemann & Ken Burbary

For any marketer who is trying to leverage digital platforms to promote their product or service, a fair knowledge of digital analytics tools is fundamental. Authored by Chuck Hemann & Ken Burbary, “Digital Marketing Analytics” enables one to learn and apply the tools that are widely used by tech firms. It also helps one learn to make the best use of online consumer data in various aspects, such as improving customer service, launching a new product, and so on.

20


The book begins with a few basic concepts in digital media and analytics, which is most likely to give a traditional marketer a better idea about the digital landscape. This segment of the book lays emphasis on traditional media analytics as well, and reveals how it complements the digital route. The authors have stressed the importance of using digital and traditional data in tandem, and developing integrated programs to be more effective online.

Next comes the crucial part: picking the tools of the trade. Here, the book not only outlines the parameters to consider when choosing social media listening and engagement tools, but also tells you how to evaluate them and take a call. Apart from this, it covers search analytics, audience analysis and content analysis, and helps you target your customers better. The next step is learning to put these concepts into action by developing a social media listening program, which is explained with the help of cases. Towards the end, the authors talk about what the future holds for digital analytics and how it will evolve over time. Comparisons are drawn between mobile marketing and marketing activities on other digital channels. As consumers have evolved over time and switched to smartphones and tablets to gain information, marketers are also expected to adopt recent techniques in order to understand these consumers better. The authors touch upon recent trends in mobile analytics like mobile device reporting, audience metrics, and app performance measurement, which are essential to track the performance of mobile marketing activities. This makes it easier for digital marketing enthusiasts and analysts to update their skill sets according to the changes in the industry and develop suitable digital programs.

21


QUANT TUTORIAL

Tutorial on Tableau-2 by Akshay Nandan R PGDM 2018-20

Formatting the axes on Tableau

Here is a tutorial to format axes on Tableau:

• Right-click the axis and choose Edit Axis.
• Within the Edit Axis panel, choose the Tick Marks tab.
• For each of the Major and Minor tick marks, choose one of the available options.
• Click OK.

Reference: https://data-flair.training/blogs/tableau-formatting/

22


QUANTFUN by Tanmay Nikam PGFS 2018-20

1. Sudoku Challenge (1): Moderate difficulty puzzle

23


2. Sudoku Challenge (2): Very difficult puzzle

3. Two trains, separated by a distance of 80 km, are running towards each other on the same track, each at a speed of 40 kmph. A bird takes flight from train X and flies towards train Y at a constant speed of 100 kmph. Once it reaches train Y, it turns and starts flying back towards train X. The bird keeps flying back and forth till the two trains collide. Determine the distance travelled by the bird.

4. There are 2 jugs with capacities of 4 litres and 5 litres respectively. The objective is to pour exactly 7 litres of water into a bucket. How can this be accomplished?

5. There are 10 stacks of 10 coins each, where each coin weighs 10 gm. However, one of the stacks is defective, and that stack contains coins which weigh 9 gm each. Determine the minimum number of weighings needed to identify the defective stack.

24


Answer 1

25


Answer 2

26


Answer 3

Velocity of approach of the two trains = (40 + 40) = 80 km/hr.
Time taken for the trains to collide = 80 km / 80 km/hr = 1 hour.
Total distance travelled by the bird = 100 km/hr * 1 hr = 100 km.

While the above is an intuitive way of solving this problem, it also has a mathematical solution in which the lengths of the bird’s successive to-and-fro legs form an infinite series whose sum gives the same answer.

Answer 4

The approach here is to first fill the 5L jug with water and empty it into the 4L jug. The 5L jug is left with 1L of water, which is poured into the bucket. Meanwhile, the 4L jug is emptied. This step is repeated, so that the bucket now holds 2L of water. Finally, fill the 5L jug with water and empty it into the bucket. The bucket will now have 7L of water, as the 5L is added directly to the previously collected 2L of water in the bucket.

Answer 5

The trick lies in creating a weighted sample for measurement, which enables the candidate to identify the defective stack in a single weighing. One coin is taken from the first stack, 2 from the second, 3 from the third, and so on, giving a total of 55 coins in hand. If none of them were defective, they would weigh 550 gm together. However, if stack 1 is defective, the total weight would be 549 gm; a defect in stack 2 would result in a total weight of 548 gm; and so on. Therefore, just one weighing helps the candidate identify the faulty stack.
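For readers who enjoy checking puzzles by computer, the reasoning in Answers 3, 4 and 5 can be verified with a short script. This is only an illustrative sketch; the function names are ours, not part of the puzzles.

```python
def bird_distance(gap_km=80, train_speed=40, bird_speed=100):
    # The trains close the 80 km gap at 40 + 40 = 80 km/hr, so they
    # collide after gap_km / (2 * train_speed) hours; the bird flies
    # at bird_speed for that whole time.
    time_to_collision = gap_km / (2 * train_speed)
    return bird_speed * time_to_collision


def jug_pours():
    # Replay the steps of Answer 4 and return the litres in the bucket.
    bucket = 0
    for _ in range(2):
        five = 5          # fill the 5L jug
        five -= 4         # pour 4L off into the 4L jug (which is then emptied)
        bucket += five    # pour the remaining 1L into the bucket
    bucket += 5           # finally empty a full 5L jug into the bucket
    return bucket


def defective_stack(stack_coin_weights):
    # Take i coins from stack i (1-indexed) and weigh them all together;
    # the shortfall from the ideal total identifies the defective stack.
    n = len(stack_coin_weights)
    ideal = sum(i * 10 for i in range(1, n + 1))   # 550 gm for 10 stacks
    actual = sum(i * w for i, w in enumerate(stack_coin_weights, start=1))
    return ideal - actual                          # shortfall in gm = stack number


print(bird_distance())            # 100.0 (km)
print(jug_pours())                # 7 (litres)
weights = [10] * 10
weights[6] = 9                    # make stack 7 the defective one
print(defective_stack(weights))   # 7
```

Running it confirms the three answers: 100 km for the bird, 7 litres in the bucket, and a single weighing that pinpoints the faulty stack.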

27


QUANT CONNECT Quantinuum, the Quant and Analytics committee of K.J. Somaiya Institute of Management Studies and Research, aims to empower students and professionals alike to organize and understand numbers and, in turn, to make good and rational decisions as future managers. The newsletter, published monthly, consists of a gamut of articles for readers ranging from beginners to advanced learners, helping young minds understand the contributions made to the field of mathematics, along with a couple of brain-racking Sudoku sections to tickle the grey cells. For any further queries and feedback, please contact the following address: K.J. Somaiya Institute of Management Studies and Research, Vidya Nagar, Vidyavihar, Ghatkopar East, Mumbai - 400077, or drop us a mail at quriosity.quantinuum@gmail.com Mentor: Prof. N.S. Nilakantan (+91 9820680741) Email – nilakantan@somaiya.edu Team Leaders: Purav Shah (+91 8511929416) VVNS Anudeep (+91 9441201685) Yatharth Jaiswal (+91 9969698361)

Editorial Team: VVNS Anudeep (+91 9441201685) Khushbu Mehta (+91 9930158610) Tanmay Nikam (+91 9699288587) Kaustubh Karanje (+91 7738219050) Dhyan Baby K (+91 9809245308) Shubham Thakur (+91 7096045088) Saumya Joshi (+91 9456684181)

28

