THE DASHBOARD
Tridant’s Online Magazine | Issue 05, February 2017
Editor’s Note
James Wakefield, Director of Solutions and Marketing, Tridant

Are you a leader?
Every organisation has information systems, and around two-thirds of them need to blend traditional and cloud-based systems in a hybrid cloud environment. Leaders recognise this as an opportunity rather than an issue, and embrace the integration challenge to power their digital transformation and realise productivity gains. Every organisation needs to get the most out of its data. Leaders solve this by giving people access to information: the right data, in the right format, at the right time, to enable decision making.
Every organisation needs to be forward-looking. Leaders analyse the past, report on the present, and predict the future. Using data to predict the future may require data from multiple systems, both on-premise and in the cloud. In this issue of The Dashboard we focus on the theme of integration. At Tridant, our aim is to help our clients be successful by providing them with accelerated decision-making capabilities as a real and sustainable source of competitive advantage. Integration lies at the heart of everything we do, so let’s explore how to make integration easy.
In this issue:
• Editor’s Note
• TM1 is not an Information Warehouse...
• The Alteryx Download Tool
• Custom source integration for Adaptive Insights
• Rich Text Commentary for your Cognos Reporting
• TM1 Integrated with R
• TM1 Integrated with Salesforce or SugarCRM
TM1 is not an Information Warehouse
... it’s an Advanced Modelling tool!
We disagree with other IBM business partners who claim: ‘TM1 allows for a centralised Information Warehouse that achieves one version of the truth – not only for your data, but also your key business structures and business rules offering a data foundation layer solution based on TM1’.
At Tridant, we strongly believe that you should use the right tool for the right job. You cannot build a house with just a carpenter; you need a range of skills and tools across planning and design, construction, and ultimately ongoing maintenance and improvement. TM1 has been around for over 20 years, has unparalleled loyalty from its user base, and is backed by a huge ongoing investment from IBM. Its unmatched flexibility, performance, and ease of use mean it is used for many different purposes. With an inbuilt ETL tool, TM1 can operate virtually independently of IT and has often taken on the role of data transformation, letting users iterate quickly through business requirements and avoid waterfall project management. With all the core components out of the box, including strong ETL and BI capability, Finance and other business functions are able to operate independently, often to the frustration of IT teams.

The flexibility and rapid development approach of TM1 are great for standing up an OLAP (cube) view of the organisation very quickly, and many large, medium, and small organisations have benefited from this. But the learnings from a TM1 build should then be applied back downstream to create a data foundation layer supported by good governance. You should not let TM1 morph over time into an information warehouse, and you should never try to load all of your transactional data into it. Just imagine trying to operate a grocery chain without a suitable distribution centre, or an airline without organised catering. A number of key technology changes have emerged over the last few years and changed the landscape:
• Cloud
• Big Data
• APIs

All three have integration as a key theme in their success.

The Tridant collect-manage-use methodology outlines our approach.
Cloud

The rise of cloud and Software as a Service has been a monumental shift in the industry. Mega vendors are now worried about startups because of the low cost of entry. TM1 has evolved into IBM Planning Analytics in the cloud and offers many benefits; at the same time, other technologies have also moved into the cloud. The idea of a data warehouse as a long, scary project costing a fortune is no longer valid. You can provision IBM dashDB as a cloud data warehouse, load your data, and be querying huge volumes of data within hours, not days. Integrating your TM1 with a data warehouse (cloud or on-premise) will benefit you in the long run. IBM Business Process Management and IBM Blueworks Live are available in the cloud, meaning complex workflows can be documented and automated in those systems. IBM App Connect Professional (Cast Iron) can be used to synchronise systems of record either on-premise or in the cloud. Master Data Management is also available in the cloud. Cloud is no longer scary, but it is important to integrate your on-premise systems with your cloud systems.
Big Data

Big Data seemed to be just a buzz term for a while, and we all wondered whether anyone was actually doing it. Well, it is real, and companies in Australia and abroad are using Big Data to gain insights into their customers. Facebook has moved to Apache Spark in a big way, and IBM, through Bluemix, lets you provision Hadoop or Spark easily. This ‘learn and burn’ approach means you can fail fast, learn, improve, and move to the next step. As the Internet of Things evolves, data will get bigger and more integrations will be required. Review your architecture and roadmap, and take advantage of the right tool for Big Data: a TM1 cube is not the place to store Big Data.
APIs

Communication via ODBC seems old school now. Every SaaS product out there has an API that lets you extract the data you need in JSON format, or even push data in. APIs hide the back-end complexity of data sources, making it easier to get data and perform integrations. You can use an API tool like Alteryx or Cast Iron to easily extract data from SaaS systems, or, if you have the skills, custom build the integration in Java and load directly into TM1 cubes.
Conclusion

The aim of this article is to educate; we are not discrediting the many use cases of TM1. At Tridant, we have the experience and depth to know when not to use TM1 and when to use a data foundation layer, and that is key to the success of any organisation’s performance management journey.
Use our TSIM methodology for your TM1 projects.

[Diagram: the TSIM phase wheel, covering Initiate, Discovery, Analysis, Design, Build, Test, Execute, Deploy, Operate, and Close]
Tridant has developed its own proprietary project management methodology, based on a number of proven methodologies and combined with a wealth of experience in delivering successful performance and information management solutions. The Tridant Solution Implementation Methodology (TSIM) is a common-sense approach that takes into account all of the lessons learned over multiple implementations to ensure our customers have a successful outcome. It mixes waterfall and agile methodologies so the project delivers on time and on budget, while remaining flexible and adjusting quickly to business needs.
30+ Custom Templates | 20+ Proven Processes
Contact us on info@tridant.com.au to learn more about TSIM
The Alteryx Download Tool
If you are using a Software as a Service (SaaS) application, you will likely at some point want to extract data from it for analysis or reporting. Any good SaaS application will have an API (Application Programming Interface). Web service APIs that adhere to the REST (Representational State Transfer) architectural constraints are called RESTful APIs. In a RESTful web service, a request made to a resource’s URI receives a response in XML, JSON, or delimited text. Using HTTP (Hypertext Transfer Protocol, the foundation of data communication for the World Wide Web), you can issue GET, POST, PUT, and DELETE requests with parameters. RESTful APIs provide fast performance and reliability.
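As a minimal sketch of the idea (the endpoint and apikey parameter below are hypothetical, not any particular vendor’s API), a RESTful GET request is just a resource URI plus query parameters and an HTTP method:

```python
import urllib.parse
import urllib.request

# Hypothetical endpoint and API key; substitute your SaaS vendor's values.
BASE_URL = "https://api.example.com/v1/opportunities"
params = {"apikey": "YOUR_KEY", "modified_since": "2017-01-01"}

# A RESTful GET is the resource URI plus URL-encoded query parameters.
url = BASE_URL + "?" + urllib.parse.urlencode(params)
request = urllib.request.Request(url, method="GET")

print(request.get_method())  # GET
print(request.full_url)
```

Swapping `method="GET"` for `POST`, `PUT`, or `DELETE` (with a request body where needed) gives you the other verbs the article mentions.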
Alteryx provides many out-of-the-box data connectors for SaaS applications like Salesforce and Marketo.
If you would like to connect to your own SaaS application, one without a standard out-of-the-box connector, the Alteryx ‘Download’ tool is the perfect option to explore via a simple drag-and-drop interface. With the Download tool you can create your own connector that authenticates, downloads, transforms, and outputs the data, all in one workflow. The best way to start is to create an ‘Input Data’ node and add a field for the API URL. Depending on the API you are using, you may create other fields here, such as ApiKey.
Drag the Download node onto the canvas and connect it to the Input Data node. Now set the URL to download to be the URL from the Input Data field.
Most API download requests will use the GET HTTP action, with parameters in the URL defining what data to retrieve. Here you can see I specify the GET action and tick the apikey field to be dynamically added to the URL. The result of the GET call will typically be in XML or JSON format. Alteryx provides ‘JSON Parse’ and ‘XML Parse’ nodes that you can connect to the Download node; these make it simple to navigate the API’s response format and then either perform further data work on it or output the results to a file or database. At Tridant we have found the Alteryx Download node to be an invaluable tool for extracting data from SaaS applications. We have Alteryx workflows currently available for IBM Marketing Cloud (Silverpop), Mailchimp, Harvest, and SugarCRM, and are working on many others. If you would like us to build an API connector for you in Alteryx, contact your Tridant Client Executive or email info@tridant.com.au.
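Outside Alteryx, the same build-URL-then-parse flow can be sketched in a few lines of Python. The endpoint, apikey parameter, and sample response below are invented for illustration; a real SaaS API will define its own names and payload shape:

```python
import json
import urllib.parse

# Hypothetical response body, shaped like a typical SaaS JSON payload.
SAMPLE_RESPONSE = '{"records": [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]}'

def build_url(base_url, apikey, **params):
    """Append the apikey and any other parameters to the request URL,
    mirroring Alteryx's 'dynamically add field to URL' option."""
    query = dict(params, apikey=apikey)
    return base_url + "?" + urllib.parse.urlencode(query)

def parse_records(body):
    """The code equivalent of the 'JSON Parse' node: turn the response
    text into a list of flat records ready for output."""
    return json.loads(body)["records"]

url = build_url("https://api.example.com/v1/accounts", "YOUR_KEY", limit=100)
rows = parse_records(SAMPLE_RESPONSE)
print(url)
print(rows)
```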
Custom source integration for Adaptive Insights
Adaptive Planning has always offered flat-file (e.g. spreadsheet) data integration through its wizard-driven import facility, as well as custom pre-built adapters to popular ERPs and CRMs. These pre-built adapters provide fully tailored integration solutions but are often more expensive to implement, as they require the Adaptive specialist consulting team. What many Adaptive Insights customers might not realise is that there is a free integration platform they can use either for importing data into Adaptive or for exporting their Adaptive data back on-premise. The Adaptive Integration module is the core building block of the Adaptive Suite. It allows Adaptive Planning, Discovery, Consolidation, and Reporting to integrate seamlessly with any data source, cloud or on-premise.
Adaptive Insights Suite
The data integration process is as simple as four key steps:
1. Identify data sources where extraction is required. Adaptive has a number of pre-built connectors, covering spreadsheets as well as custom on-premise and cloud sources including Intacct, Salesforce, Microsoft Great Plains, and NetSuite.
2. Load selected data to a staging area and perform the necessary data manipulation.
3. Map staging data columns to existing Planning accounts, organisation levels, and dimensions.
4. Load data from the staging area into Adaptive Planning.
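The four steps above can be sketched in code. This is only an illustrative skeleton; the source rows, column names, and account mapping are invented, and the loader is a stub standing in for Adaptive’s actual import:

```python
# Step 1 (stubbed): rows extracted from a hypothetical source system.
EXTRACTED = [
    {"gl_account": "4000", "dept": "Sales", "amount": "1500.00"},
    {"gl_account": "5000", "dept": "Sales", "amount": "-250.00"},
]

# Hypothetical step-3 mapping from GL codes to Planning accounts.
ACCOUNT_MAP = {"4000": "Revenue", "5000": "COGS"}

def stage(rows):
    """Step 2: land rows in staging and do light manipulation (typing)."""
    return [dict(r, amount=float(r["amount"])) for r in rows]

def map_to_planning(rows):
    """Step 3: translate staged columns to Planning accounts and levels."""
    return [
        {"account": ACCOUNT_MAP[r["gl_account"]], "level": r["dept"], "amount": r["amount"]}
        for r in rows
    ]

def load(rows):
    """Step 4: hand mapped rows to the Adaptive loader (stubbed here)."""
    return len(rows)

staged = stage(EXTRACTED)
loaded = load(map_to_planning(staged))
print(loaded)  # 2
```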
The secret sauce enabling bi-directional data transfer is Adaptive’s Pentaho Kettle plug-in, with which you can write Kettle scripts that export data from Adaptive Planning back into the underlying source system.
Pentaho can connect to more than 100 cloud or on-premise databases, including Oracle, SAP ERP systems, MySQL, and Microsoft SQL Server. It offers a graphical ETL designer and a rich library of pre-built components to access, prepare, and blend data, as well as powerful orchestration capabilities to coordinate and combine transformations, including notifications and alerts. Pentaho can prepare and blend data as required by Adaptive and use inbuilt Adaptive connectors to load it into the Adaptive Suite. This makes integration much simpler, as there is no need to mess with APIs or additional scripting to write back to the Adaptive cloud.
Key benefits
• Integration is customer-controlled: customers can create and manage their data imports end-to-end.
• Integration provides a common data platform for the entire Adaptive Suite. The same platform can be used to bring data into Planning, Consolidation, and Discovery Enterprise.
• Integration is the common data platform used by Adaptive Professional Services teams, customers, and partners. Professional Services teams and partners can deliver integration solutions, which can then be managed by customers; alternatively, customers can build their own integrations on the same platform.
• Integration as a data platform is specific to Adaptive, including out-of-the-box business rules for loading data into the different Adaptive applications.
• Integration provides the ability to set up scheduled automatic extracts for both on-premise and cloud-based systems.
• Integration supports Adaptive pre-configured adapters for specific source systems (NetSuite, Salesforce, Intacct, Microsoft Dynamics Great Plains). These adapters make it easy to extract the most commonly used data objects from those systems.
• Integration provides a cloud-based staging area for click-not-code, lightweight data preparation.
• An optional Pentaho (ETL tool) data integration component, with Adaptive-specific plugins, is available for advanced data manipulation and preparation.
Tridant are experienced and certified Adaptive Integration experts, so feel free to contact us if you would like a demonstration or wish to extend the capabilities of your existing implementation.
Rich Text Commentary for your Cognos Reporting
There are a number of custom commentary options for your Cognos reports. Here we introduce another option for you to consider, one that utilises the integration between IBM Cognos and IBM Cognos TM1. If you have Cognos BI and TM1, you have a perfect combination of tools for variance analysis and commentary. Cognos BI can provide enterprise-grade financial reporting to hundreds or thousands of users in your organisation, based off TM1 holding your Actual/Budget/Forecast data with variance calculations. In a true performance management practice you will be tracking departments’ Actual against Budget/Forecast spend and, where it falls outside a tolerance, asking them to explain the variance.
TM1 has always been able to store text, but its rich text capability has been limited. By rich text, we mean that people can format the text with bold, italic, and underline, or select fonts, size, alignment, and so on. This might seem like a small thing, but users increasingly expect it through familiarity with web editing in Gmail and Office 365. Cognos BI is typically deployed for read-only reporting, i.e. no write-back to underlying source systems. In the case of commentary, though, we do want to write the text back into a storage area, formatting included.
A video demonstrates how a Cognos BI report can include a simple commentary column, with a pencil icon indicating that you can write to the cell.
When you click on the pencil icon, you are then taken to another screen, where you can enter text with any rich formatting you prefer. The example below demonstrates bullet lists, bold, italics, colour and highlighting.
Upon pressing the ‘Update’ button, the rich text commentary is stored back into TM1 at the defined cell intersection. You can see that the measure cell actually stores the generated formatting code.
So how is this solution put together?
• On the TM1 side, a TI (Turbo Integrator) process accepts variables that determine the cube intersection the commentary should be written against.
• On the Cognos BI side, the variables from the selected row are passed into parameters using the BI prompt API; the parameter values are passed to JavaScript variables, which are then passed to the TM1 TI using the TM1 Web API and some jQuery scripting.
• The TM1 TI runs and stores the commentary with all the required rich text formatting code.
• The commentary is then displayed in BI using the Rich Text object.
The rich text editor itself is a freeware plugin called CKEditor (http://ckeditor.com/). The editor can be customised in any way to fit your requirements (e.g. only bold and italics buttons, or the full suite).
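The article’s solution calls the TI process from jQuery via the TM1 Web API. As a hedged illustration of the same round trip, here is a sketch that builds the equivalent call against the TM1 REST API (available in recent TM1 versions); the server URL, process name, and parameter names are hypothetical:

```python
import json
import urllib.request

# Hypothetical server, process, and parameter names, for illustration only.
TM1_HOST = "https://tm1server:8882/api/v1"
PROCESS = "StoreCommentary"

def build_ti_call(process, **params):
    """Build the REST request that executes a TI process with parameters:
    POST .../Processes('name')/tm1.ExecuteWithReturn."""
    url = f"{TM1_HOST}/Processes('{process}')/tm1.ExecuteWithReturn"
    body = {"Parameters": [{"Name": k, "Value": v} for k, v in params.items()]}
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_ti_call(PROCESS, pDept="Sales", pMonth="Jan", pComment="<b>On track</b>")
print(req.get_method())  # POST
print(req.full_url)
```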
Do you want sexy Cognos BI dashboards like these on your TM1 cubes?
Contact us on info@tridant.com.au to speak to the best Cognos BI team in Asia Pacific
TM1 Integrated with R
The R language has come a long way since its creation by Ross Ihaka and Robert Gentleman at the University of Auckland, New Zealand. In 1995 R was released under the GNU General Public License, making it effectively free to use. Today R is widely used among statisticians and data miners for developing statistical software and data analysis. It runs on almost any standard computing platform/OS (even PlayStation). Being community supported, you don’t have the backing of a large enterprise, and
functionality is based on consumer demand and reliant on users contributing code. As data analysis becomes more and more important as a competitive differentiator, and the role of the data scientist evolves, usage of R continues to rise. Although some people might be put off by R being a scripting language, there are many freely available courses and a number of front-end studios that make adoption easier. The most popular front end for R is RStudio (https://www.rstudio.com/), which you can download and use for free. It makes R easier to use by including a code editor along with debugging and visualisation tools. If you don’t want to worry about installing R yourself and connecting it to your data, take a look at IBM dashDB: a cloud data warehouse where you can store your data, with RStudio pre-installed and accessible in the cloud.
So, with TM1 being a budgeting, forecasting, and scenario modelling tool, why would you be interested in integrating it with R? R being a predictive modelling tool, it can calculate, with a degree of confidence, a future set of values for you. This allows you to set a baseline that gets adjusted, rather than starting from zero. Predictive methods can help with common business questions like:
• Customer Acquisition: show the drivers for keeping customers
• Customer Churn: why you are losing customers and what you can do to retain them
• Customer Segmentation: separate customers with similar characteristics into groups for further analysis
• Customer Win-back: what to do to win back lost customers
• Market Basket Analysis: determine which groups of products are commonly purchased in the same transaction
To showcase how TM1 can utilise the power of R, we thought it best to have a video showcasing the capability.
In the video there are a few key callouts we should mention:
• R is installed on the same computer in this demo
• Rserve is assumed to be running
• Based off 12 months of actuals in TM1, R performs a square root calculation
• TM1 calls R via the ExecuteJavaN TI command and passes in the parameters for each line of the data source
• Java (Eclipse) is used to create the connection to R
• Java (Eclipse) is used to call the R calculation, receive it in a variable, and pass the result back to TM1 via the CellPutN command
• The TM1 measure to hold the calculation and the R calculation method can both be changed
Although this is a simple demo, it opens up so many possibilities for organisations to enhance their TM1 models with R, or to consider the possibilities of advanced analytics more broadly.
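The demo’s round trip (actuals out of TM1, a calculation, results back in) can be sketched as follows. In the real demo the calculation runs in R via Rserve and Java; here both the 12 months of actuals and the CellPutN stub are invented for illustration:

```python
import math

# Hypothetical 12 months of actuals, standing in for the TM1 data source.
actuals = [144.0, 169.0, 196.0, 225.0, 256.0, 289.0,
           324.0, 361.0, 400.0, 441.0, 484.0, 529.0]

results = {}  # stands in for the cube cells written back by CellPutN

def cell_put_n(value, month):
    """Stub for TM1's CellPutN: store the calculated value at the intersection."""
    results[month] = value

# Per the demo: for each line of the data source, send the value out,
# run the calculation (square root), and write the result back to TM1.
for month, value in enumerate(actuals, start=1):
    cell_put_n(math.sqrt(value), month)

print(results[1])   # 12.0
print(results[12])  # 23.0
```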
TM1 Integrated with Salesforce or SugarCRM
Your CRM contains so much vital information about the health of your organisation. It’s your heartbeat: if the data within your CRM is of good quality and the database is growing, your organisation is likely in good shape. A CRM like Salesforce or SugarCRM will also contain sales opportunities against your CRM contacts. These sales opportunities are defined by certain attributes, e.g. salesperson, region, and product, which allow you to report on your pipeline. These pipeline management
tools, though, are designed to report only today’s pipeline, without any historical values. If you want to see how opportunity records have changed over time, you need to extract the data into a data warehouse in order to track the slowly changing dimensionality. The revenue ($), sales stage, and confidence (%) are also vital measures in your pipeline, but the system lacks the ability to do any what-if forecasting. You might, for example, want to build what-if models around:
• Changing confidence % against certain large opportunities
• Delaying close dates of opportunities into next quarter
• Changing sales stages of opportunities to model overall sales stage forecasts in the future
TM1 provides the perfect engine for this type of what-if modelling, but how do you integrate TM1 with a cloud-based CRM like Salesforce or SugarCRM? Most Software as a Service applications have an API (Application Programming Interface) that allows you to authenticate to your account, issue queries, and receive results. Enabling TM1 to talk to the CRM via the API can be done with Java. Essentially, a TM1 user executes a Turbo Integrator process that passes parameters into a Java .jar file. This jar file is compiled code that handles authentication to the API, sends the API query, retrieves the data in JSON format, and then loads the data into TM1 via the Java TI command codes. To set up your environment, all you need to do is place the .jar file in your DataDirectory\javaextensions\user folder and set the JavaClassPath parameter in your tm1s.cfg file.
To call the .jar file, create a TI process and use the ExecuteJavaN command, passing in the Java class to call and the logon details as parameters. N.B. the password is blanked out in the screenshot below.
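As a hedged sketch of the two setup pieces described above (the install path and the parameter names are hypothetical; the class name is the one shown later in the article):

```ini
; tm1s.cfg -- point TM1 at the jar in the Java extensions folder
JavaClassPath=C:\TM1Data\javaextensions\user\sfconnector.jar

; In the TI process, the call then looks something like:
; nResult = ExecuteJavaN('salesforceopportunity', pUser, pPassword, pToken);
```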
The code below shows the Java class “salesforceopportunity” accepting the Salesforce logon parameters and then calling another Java class, “SFEngine”. It also performs the TM1 cube and dimension builds, but first the “SFEngine” class does the Salesforce magic.
Data can be retrieved from the Salesforce API using SOAP, so this function handles building the Salesforce SOAP query, executing it, and cleaning up. Once this class has finished, it passes the clean result set back to the Java class that handles the TM1 load.
Java easily handles the creation of cubes, dimensions, and elements via the same commands you are used to in TI, including CellPutS and CellPutN.
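In the article this load is written in Java; as an illustrative Python sketch of the same idea, each opportunity row becomes string and numeric cell writes at the right intersections. The rows and the cell-put stubs below are invented:

```python
# Hypothetical opportunity rows, shaped like a cleaned-up Salesforce result set.
OPPORTUNITIES = [
    {"Name": "Acme renewal", "StageName": "Negotiation", "Amount": 50000.0},
    {"Name": "Globex upsell", "StageName": "Prospecting", "Amount": 12000.0},
]

cube = {}  # stands in for the TM1 cube the Java code builds

def cell_put_s(value, opportunity, measure):
    """Stub for CellPutS: write a string value at a cube intersection."""
    cube[(opportunity, measure)] = value

def cell_put_n(value, opportunity, measure):
    """Stub for CellPutN: write a numeric value at a cube intersection."""
    cube[(opportunity, measure)] = value

# Load each opportunity into the two-dimensional cube described below.
for opp in OPPORTUNITIES:
    cell_put_s(opp["StageName"], opp["Name"], "Sales Stage")
    cell_put_n(opp["Amount"], opp["Name"], "Revenue")

print(cube[("Acme renewal", "Revenue")])  # 50000.0
```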
The result is a two-dimensional cube that can be refreshed as often as you like. Now that you have this Salesforce dataset in TM1, you can create other cubes to model your opportunities and run what-if scenarios.
Tridant has also written the same capability for SugarCRM and is happy to write other connectors if you would like one for your SaaS system. Tridant is planning to make this Java code freely available on GitHub. If you would like a copy, please email info@tridant.com.au or contact your Tridant Client Executive.
Tridant Industry Solutions Leverage your data to become more responsive to market conditions and achieve your organisation’s objectives. Tridant’s Industry Solutions can unlock new value and sustainable performance improvement across industry sectors. Our unique solutions are the result of years of industry and functional expertise, coupled with a deep understanding of data and its importance in lifting performance.
Learn more today at: https://tridant.com.au/solutions/tridant-solutions-industry/
Contact Us

Melbourne (Head Office): Level 3, 1180 Toorak Rd, Camberwell VIC 3124 | T 1300 737 141
Sydney: Level 4, 50 Clarence St, Sydney NSW 2000 | T 1300 737 141
Brisbane: Level 19, 10 Eagle St, Brisbane QLD 4000 | T 1300 737 141
Adelaide: Level 2, 70 Hindmarsh Sq, Adelaide SA 5000 | T 1300 737 141
Hobart: Level 6, Reserve Bank Building, 111 Macquarie Street, Hobart TAS 7000 | T 1300 737 141
Perth: Ground Floor, 45 St George’s Tce, Perth WA 6000 | T 1300 737 141
Canberra: Level 1, The Realm, 18 National Circuit, Barton ACT 2600 | T 1300 737 141
Singapore: 16 Collyer Quay, Level 18, Singapore 049318 | T +65 681 89262

www.tridant.com.au | E: info@tridant.com.au | 1300 737 141
The Dashboard | Issue 05 - February 2017 Designed and Edited by Dilan Gunawardana Copyright 2017 Tridant | About Tridant
Subscribe to The Dashboard