A high-velocity, agile, and resilient software artifact assembly line. Wishful thinking?


Is that a myth?  Am I asking for the impossible?

Is there a way my in-house programmers can deliver continuously on my ever-changing analytics application needs, on a timeline that is always “AS OF YESTERDAY!”, and at a fixed cost I know up front?

As an energy and equity trader (and recovering programmer), I have always struggled with the time it takes my IT team to deliver a small yet important piece of software that allows me to analyze an aspect of the natural gas, crude oil or equity market and take advantage of fleeting opportunities that may arise.

For a long time, I would spin up an Excel spreadsheet, coupled with some expert-level use of formulas and an even more daunting use of Visual Basic for Applications (VBA), to stand up a quick “analytical application.” I would record a few macros to “automate” the manual tasks within the spreadsheet, most of which were cut/paste operations and range copies. Then I would get a little smarter and create functions in VBA as time permitted, so I could trigger the analysis at the click of a button. That worked great for my immediate needs. “I love MS Excel, my Swiss Army knife on the trade floor!” I admit I am in love!

This works for a while, but very soon I become dependent on the spreadsheet-based application and extend it further: more ideas, more intense analytics. All nice and dandy, until the day comes when my beloved Excel spreadsheet, with all the data therein, simply cannot handle the analytical demands and sheer complexity anymore.

I arrange a meeting with my IT guys, those “brainiacs” sitting at one end of the trading floor. They agree to a meeting a week out. When it finally occurs, I meet with a Solutions Architect and a Director of IT. I detail my needs on the whiteboard. My tryst with MS Excel and VBA is met with disdain, and my analytics platform is dismissed as “not a real application, just a spreadsheet.” They promise they will come up with a better, more robust solution. I walk away in anticipation of a robust, resilient, yet agile and responsive application that can change on a whim.

Two weeks later a business analyst walks up to me, asking me to detail my needs. I sketch out what my spreadsheet does and send the spreadsheet over to be “analyzed.” A couple of weeks later, I have another meeting with the Solutions Architect, the Director of IT, and the business analyst. I am handed an elaborately published “Business Case” document and asked to vet it for correctness before they can kick off the analysis phase for the application.

I read through the document and realize it went through three iterations and reviews before getting to me. I diligently read the contents (most of which is boilerplate mumbo jumbo I do not understand) and finally get to a point where things start to make sense. But wait: is this not just a more elegant visual representation of my doodles on the whiteboard from my first meeting with the Solutions Architect and the IT Director, coupled with the “back of the napkin/printer sheet” doodles I handed to the business analyst?

I approve the content at the next meeting, a couple of days later. A follow-on meeting is scheduled for Friday afternoon, which I reschedule for early the next week.

I attend this meeting, only to kick off the next “phase” of the project, called the “elaboration” phase. I face a barrage of questions that I have no answers for and whose relevance I do not even understand.

“Who has access to this data?” (required for IT/data governance). “What is the required SLA?” (no clue what that is). “Are any performance metrics critical?” “Do you want a web-based solution or a desktop solution?”

I go, “Gee whiz! I don’t know!” All I want is a piece of software that pulls a few datasets from the EIA website, another set of data continuously from a vendor’s FTP server I have subscribed to, and a third dataset quantifying the weather, sent to me as an email attachment, plus a tool that runs a multivariate linear regression on those datasets to predict today’s residential and commercial gas demand.
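(For the technically inclined: the analytical core of what I was asking for really is that small. Here is a minimal sketch of the regression step in Python, assuming the three datasets have already landed as CSV files; every file name and column name below is a made-up placeholder for illustration, not anything my vendor or IT team actually uses.)

    import pandas as pd
    from sklearn.linear_model import LinearRegression

    # Hypothetical inputs: three CSV files keyed by date.
    eia = pd.read_csv("eia_history.csv", parse_dates=["date"])
    vendor = pd.read_csv("vendor_feed.csv", parse_dates=["date"])
    weather = pd.read_csv("weather.csv", parse_dates=["date"])

    # Line the three datasets up row by row on the date column.
    df = eia.merge(vendor, on="date").merge(weather, on="date")

    # Explanatory variables (placeholder column names) and the target:
    # residential and commercial gas demand.
    X = df[["heating_degree_days", "wind_speed", "vendor_flow_estimate"]]
    y = df["res_com_demand"]

    model = LinearRegression().fit(X, y)

    # Predict today's demand from the latest row of observations.
    print("Predicted res/com demand:", model.predict(X.tail(1))[0])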

The IT team troops off to “analyze” the requirements. Three weeks later, I attend a meeting where there is now the Solutions Architect, the IT Director, the business analyst, and a software lead. I get handed a “Design Document” that details the guts of what the application would look like, the team size (seven in all: a project manager, a business analyst, an architect, and four programmers, one lead and three others), an elaborate Gantt chart (with a seven-month timeline)... ANNNNNND... wait for it!

THE COST... a cool half a million dollars to build. “There goes my bonus!” I FREAK, and I am out of there in the next five minutes. I am going to make do with my beloved Excel spreadsheet “application.”

SOUND FAMILIAR? Been there, done that at least a hundred times. But there is a silver lining...


THE SILVER LINING

Remember I said earlier... as an energy and equity trader (AND RECOVERING PROGRAMMER)? Yes, I was on the dark side for a very fruitful 11 years as a techie.

So I dust off my rusty programming skill set and go about finding a solution that is as agile and flexible as my Microsoft Excel-based application.

One that I can change as often as I want, yet resilient and robust enough that I can run it autonomously and unattended on the company’s server infrastructure, without much ado and without the “compile, build, deploy, run” cycle that traditional high-end enterprise systems are built around.

Don’t get me wrong: the repeatable software delivery process works great for an application wherein requirements are known up front, are set in stone, and have a useful life of three to five years or more. Not so much in the analytics space, where the only constant is change and the average life span may be no more than six months. I need something quick for building throwaway applications with limited utility at a low price point.


ENTER THAT GEEKY GUY WHO HANGS OUT AROUND THE COFFEE MAKER AND THE JUNK FOOD VENDING MACHINE!

I walk up to this geeky programmer sitting in a corner cubicle, far removed from everyone else. I have run into him previously at the coffee maker and the junk food vending machine, and he has avoided all forms of communication beyond the proverbial “HEY!”

I ask him if there are options as flexible as my beloved Microsoft Excel and as robust as the C++ he uses, but something that will meet my needs without requiring an advanced computer science degree from MIT (he has a Ph.D. in artificial intelligence and machine learning).

This newfound geeky friend of mine (who seems to have the latest Beats wireless headphones permanently screwed onto his head) sends me a note asking me to break down my requirements into elementary, atomic tasks: tasks that do one thing and one thing only.

He walks me through breaking down my problem space into data acquisition, data preparation, data storage, analysis tasks, visualization tasks, dissemination tasks, orchestration tasks, and operational tasks. I do that without much ado.
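In practice, a decomposition like that might shake out into one small script per task, something like the following (every file name here is hypothetical, purely for illustration):

    acquire_eia.py         # data acquisition: download the EIA datasets
    acquire_vendor_ftp.py  # data acquisition: pull the vendor's FTP drop
    acquire_weather.py     # data acquisition: read the weather email attachment
    prepare_merge.py       # data preparation: clean and join the raw files
    store_db.py            # data storage: load prepared data somewhere durable
    analyze_regression.py  # analysis: run the multivariate regression
    visualize_refresh.py   # visualization: refresh the Excel dashboard's data
    disseminate_email.py   # dissemination: mail out the morning summary
    orchestrate.py         # orchestration: run all of the above on a schedule
    healthcheck.py         # operations: alert if any step failed overnight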

Then he implements each task independently of the others, each script accomplishing just one aspect of the analytic application I am looking to build. On completion of the scripts, he runs each one on whatever schedule is most appropriate, repeating on a daily cycle as required.
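As a sketch of what that scheduling glue might look like, here is a minimal orchestrator using the popular Python schedule package; the times and script names are the hypothetical ones from above, and plain old cron would do just as well:

    # orchestrate.py: run each atomic script at its own time every day.
    import subprocess
    import time

    import schedule

    def run(script):
        # Each task is an independent script; if one fails, the rest still run.
        subprocess.run(["python", script], check=False)

    schedule.every().day.at("01:00").do(run, "acquire_eia.py")
    schedule.every().day.at("01:15").do(run, "acquire_vendor_ftp.py")
    schedule.every().day.at("01:30").do(run, "acquire_weather.py")
    schedule.every().day.at("02:00").do(run, "prepare_merge.py")
    schedule.every().day.at("02:30").do(run, "analyze_regression.py")

    while True:
        schedule.run_pending()
        time.sleep(60)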

Now, that was not so hard, was it? I do not have to run the analytics manually anymore; they run overnight as I sleep. I just come in to work in the morning, hit refresh on my “Main” spreadsheet, and kick off a macro to update the graphs, pivot tables, and dashboard I built myself in Excel.


IT TOOK ALL OF TWO WEEKS TO GET THIS WORKING... WOW!

I ask: can this be taken one step further? I want to build a forward-looking view of supply and demand (S&D) based on the weather forecasts and the 10-year normal weather.

My newfound friend goes, “Yeah, no big deal!” A significant number of the tools I need are already in place. I just need to build a few more elemental tasks and cobble them together differently using the scheduler, and they will all work in parallel.
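Here is a rough sketch of what “work in parallel” can mean in practice: the acquisition tasks do not depend on one another, so they can be fanned out together before the dependent steps run (the script names are again the hypothetical ones from earlier, plus a made-up forecast task):

    # Hypothetical parallel fan-out of the independent acquisition tasks.
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    acquisition = [
        "acquire_eia.py",
        "acquire_vendor_ftp.py",
        "acquire_weather.py",
        "acquire_forecast.py",  # the new forward-looking weather task
    ]

    with ThreadPoolExecutor() as pool:
        # Run the independent scripts side by side, then continue with the
        # dependent preparation and analysis steps once all have finished.
        list(pool.map(lambda s: subprocess.run(["python", s]), acquisition))

    subprocess.run(["python", "prepare_merge.py"])
    subprocess.run(["python", "analyze_regression.py"])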

Three weeks later we are done, and I am a happy camper.

We have incrementally added a significant number of analytical “use cases” (another buzzword I picked up) in a short amount of time. And we have yet to hit the half a million dollars in expenses my IT team initially proposed for just one analytical use case.

So, to answer my own question: is “a high-velocity, agile, and resilient software artifact assembly line for energy trading analytics” just wishful thinking on a natural gas trader’s part, or can it be done?

The answer is a resounding “YEEEESSSSSSS!”, it can be done. With a departure from best practices and a little out-of-the-box thinking, it is possible to build an agile and responsive, yet robust and resilient platform positioned between MS Excel desktop applications and ETRM- or ERP-grade enterprise applications. Layer in elemental software artifacts that are atomic in nature and do one thing, and one thing only, well. Develop incremental artifacts as needed. Cobble some or all of these artifacts together, driven by a schedule, an event, or a manual stimulus, to accomplish an analytical task that generates usable pieces of information for trading decision support.

As a shameless plug, my dear geeky friend has now developed a small autonomic analytics platform called Energiewerks Ensights. The platform is currently being used to implement use cases targeted at natural gas fundamental and technical analysis, natural gas European and American options pricing, natural gas market risk management, natural gas storage optimization, confirmations, and several other applications I am not privy to.