How to Switch the Finance Process from Propeller to Jet Speed

  • By Gauthier Vasseur
  • Published: 11/6/2017

Just getting by with spreadsheets is no longer an option for finance and treasury. The performance advantage gained through efficient analytics is simply too large to ignore, no matter how smart the analysts are. Following a recent workshop at a banking institution, I observed a sixtyfold time reduction in an analytics process, from 20 hours a month to 20 minutes. It is like running a 42.2 km marathon with a 41.5 km lead over our peers (the ones with manual, spreadsheet-based processes): in the time it takes us to finish, a rival moving at one-sixtieth of our speed has covered barely 0.7 km. Even the best athlete won't ever beat you.

With such a lead, any runner would have the time to ponder his or her stride, breathing or pace. Any athlete would have the opportunity to plan the next race and the preparation for it. Any individual would enjoy the chance to connect with the crowd and the media. Not only would the runner win by a crushing margin every time, but he or she would also already be securing a competitive advantage for the next races. Sounds pretty unfair, doesn't it?

Overcoming the fear of change

What changed in the approach of Virginia (as we'll call her) to deliver this dramatic process optimization? How could she shift from 20 hours of lagging report production to 20 minutes of proactive risk and opportunity analytics that changed her role?

It took just two days of training to trigger this pivot. None of what Virginia learned was rocket science: like her generation and many before her, she had simply never taken a class that actually taught business-applied data management. She crossed the chasm, overcoming her spreadsheet-induced habits and her fear of change to acquire the base knowledge she had missed for so long.

Virginia was in charge of supervising bank reconciliation across 50 subsidiaries around the world. The diversity of systems and the lack of process standardization meant that full reconciliation could not be delivered automatically. Even though she was aware that the analysis produced little insight or operational feedback, she had to deliver it every month.

In less than a week, she transformed 50 spreadsheets with monthly tabs into a single data table with well-defined dates and identifiers. She set up automated data collection where possible, or fell back on flat-file exchanges when interfaces were too complex to program. She designed automated reports that could be sent not only to management in aggregated form but also to every subsidiary as operational and decision support. She began to build large data sets recording the ins and outs on bank accounts: when, how fast and by whom they were reconciled. She naturally started to ask new questions about risk, fraud and compliance. In just a few days, she had pushed her analytics boundaries to the next level and was already itching to apply statistics and pattern recognition to her data treasure trove. And all it took was for Virginia to learn notions that were well within her reach.
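
To make that consolidation step concrete, here is a minimal sketch in Python of how workbooks with monthly tabs might be folded into one table. It assumes pandas is available, and the folder name, tab-name format ("2017-06" style) and output file are hypothetical placeholders, not Virginia's actual layout:

```python
import glob
from pathlib import Path

import pandas as pd

# Hypothetical sketch: fold one workbook per subsidiary, each with
# monthly tabs, into a single well-keyed table. File locations, tab
# names and columns are illustrative assumptions.
frames = []
for path in glob.glob("reconciliation/*.xlsx"):
    sheets = pd.read_excel(path, sheet_name=None)   # read all tabs at once
    for tab_name, df in sheets.items():
        df["subsidiary"] = Path(path).stem          # explicit identifier
        df["period"] = pd.to_datetime(tab_name, format="%Y-%m")
        frames.append(df)

# One flat table instead of 50 workbooks with monthly tabs.
table = pd.concat(frames, ignore_index=True)
table.to_csv("reconciliation_master.csv", index=False)
```

The point is structural: once every movement carries an explicit subsidiary identifier and period date, reports become queries instead of copy-and-paste sessions.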

She discovered the dynamics of data. Data is like any raw material: it is collected in different shapes and forms, and it must be transformed, assembled and delivered to a recipient. There are ground principles that must be known to handle data properly and efficiently, such as:

  • The data supply chain structure from producer to consumer
  • The notion of tables, keys and joins
  • The concepts of facts, dimensions and master data
  • The basics of master data management, data transformation and quality
  • SQL language logic (see the join sketch after this list).
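
These notions fit in a few lines of code. Below is a toy sketch, using Python's built-in sqlite3 module, of a dimension table joined to a fact table on a shared key; every table and column name is an assumption made for illustration, not something taken from Virginia's bank:

```python
import sqlite3

# Toy illustration of tables, keys and a fact-to-dimension join.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE subsidiary (             -- dimension / master data
        sub_id TEXT PRIMARY KEY,
        name   TEXT,
        region TEXT
    );
    CREATE TABLE bank_movement (          -- fact table
        mov_id     INTEGER PRIMARY KEY,
        sub_id     TEXT REFERENCES subsidiary(sub_id),
        booked_on  DATE,
        amount     NUMERIC,
        reconciled INTEGER                -- 0 = open, 1 = matched
    );
""")

# A join on the shared key turns raw movements into a report:
# open reconciliation items counted per region.
rows = con.execute("""
    SELECT s.region, COUNT(*) AS open_items
    FROM bank_movement m
    JOIN subsidiary s ON s.sub_id = m.sub_id
    WHERE m.reconciled = 0
    GROUP BY s.region
""").fetchall()
```

The join expresses in one declarative statement what a spreadsheet would handle with fragile lookups across 50 tabs.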

She opened up to new solutions. Spreadsheets had until then been her one-trick pony for every analytics challenge. She realized that she could easily harness the power of new solutions such as:

  • ETL (extract, transform and load) to capture the right data and deliver automated preparation
  • Relational databases and data modeling
  • Master data management (a toy example follows this list).
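
Master data management, in particular, can be demystified with a toy example: a single reference table that maps each subsidiary's local labels to one canonical code. All names and values below are invented for illustration:

```python
import pandas as pd

# Toy master-data mapping: each subsidiary labels its accounts
# differently; one reference table makes them comparable.
master = pd.DataFrame({
    "local_label": ["Cash USD", "CASH-US", "Caisse USD"],
    "canonical_account": ["CASH_USD", "CASH_USD", "CASH_USD"],
})
raw = pd.DataFrame({
    "subsidiary": ["US01", "US02", "FR01"],
    "local_label": ["Cash USD", "CASH-US", "Caisse USD"],
    "amount": [120.0, 75.5, 310.0],
})

# The "transform" step of a small ETL: conform labels, then aggregate.
clean = raw.merge(master, on="local_label", how="left")
print(clean.groupby("canonical_account")["amount"].sum())
```

This conform-then-load step is the heart of the "T" in ETL: once labels agree, aggregation across 50 subsidiaries is a one-line group-by.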

She demystified the big data hype and clearly identified what she needed from a business standpoint. For her and her colleagues, big data was never going to be about volume (even 5-10 million rows is small data) or velocity (the analytics could afford a lag of a few hours, if not a day, and the team could not react in real time anyway). It was about connecting the dots between all of her information sources and making sense of them. Interestingly, by avoiding the lure of the big data buzz, she became a much stronger and better-educated supporter of it. She turned into a dependable driver for these projects, because she came with not only clear business questions but also the curated, connected data she had amassed for her core analysis.

Virginia knew she needed to reverse the 80/20 rule between data processing and analysis. She had just created a new one: the 20-20 rule, where 20 hours of tedious manual analytics become 20 minutes of high-value insight delivery.

Gauthier Vasseur is an instructor at Stanford University.
