Why XLeratorDB is the right tool for the financial crisis

Written by: Charles Flock
3/12/2009 7:19 PM

A brief discussion about the advantages of XLeratorDB over the combination of EXCEL and SQL Server, or, as we like to put it, a proof of the equation:
XLeratorDB > SQL Server + Excel
The first spreadsheet program that I truly embraced was Lotus 1-2-3. It ran under DOS, had what we would now consider to be a very difficult user interface, and let me do things that had been previously unimaginable. At the time, I was working in a large commercial bank on Wall Street and while the banking environment was changing rapidly, the bank’s systems were not. And Lotus 1-2-3 was perfect for filling in the gaps. It allowed us to satisfy the demands of the traders without requiring a huge investment in technology, training, procedures, etc. It was like a gift from Olympus, a veritable lightning bolt, which is why I should not have been surprised when the bank took the official position that it could not be used as the books and records of the bank. Way too much flexibility; too hard to audit; too hard to keep track of; complete and utter anarchy as far as they were concerned. This more or less convinced me to get out of banking and into the software business. Twenty-five years later, though, I have to admit that they had a point.
Of course, Lotus 1-2-3 is long gone, and EXCEL has almost become synonymous with spreadsheet. It is an extremely powerful tool that has its place in every financial institution. One of my clients in the '90s used to say to me that his bank, Bankers Trust, had made more money off of EXCEL than Microsoft had. And given the performance of Bankers Trust at that time, I had no reason to doubt him. But Bankers Trust blew up in 1998 and was acquired by Deutsche Bank.
Database management systems (DBMS), on the other hand, are all about stability, recoverability, and auditability. First, to even let someone see data, you have to grant them permission. There are multiple levels of permission: some users can just see data; others can see and add data; still others might be able to delete data.
Second, everything that happens to the database is logged. The DBMS allows you to control processing in an orderly manner so that data is never inconsistently rendered. It keeps track of everything. It is easy to schedule backups of the data so that you can see what it looked like yesterday, or the day before, or a week ago, or whatever time period you want to make available. If anything untoward happens (a power failure, a communications failure, some other emergency), generally the only risk is that you lose the data that was being processed at the moment the event occurred.
Third, everyone looks at the same data. You can grant access to data to as many people as you want, and you can even create databases that are used for inquiry purposes only, to reduce the overhead on the transaction processing systems that use the database.
Finally, DBMS are really powerful. They are capable of processing hundreds of thousands of transactions a second.
The only problem with DBMS is that they are much harder to use than spreadsheets. Generally the person responsible for administering the database, the DBA, has professional certifications and is responsible only for the database management system, not the data contained in it. To actually manipulate the data, or extract it, you need someone who can use the scripting language that DBMS use: SQL. SQL stands for Structured Query Language (not Standard Query Language, which would be more useful).
That’s the basic dichotomy. Spreadsheets are very flexible, contain a substantial amount of functionality allowing you to manipulate data, and are excellent at handling ad hoc data analysis and presentation, and for storing the current analysis on a local PC or network. DBMS are much more powerful in terms of raw computing power, provide a much greater degree of stability, recoverability, and consistency, and are great at handling millions and even billions of rows of data.
XLeratorDB combines the best of both worlds. By using SQL CLR, XLeratorDB takes all of the familiar spreadsheet functions and makes them available in SQL Server. It only became technically possible to do this in SQL Server 2005. I am going to avoid getting into the ugly technical details, but with SQL CLR you can use any managed .NET language to create functions on the database itself. Managed code is compiled into .NET assemblies which, after being verified for type safety, are registered with the database. After that, they can be invoked like any other procedure. Aren't you glad I didn't get into the ugly technical details?
This brings a tremendous amount of power and flexibility into the database, which we think can fundamentally change the way you think about processing data.
Here's a real-world example. A private equity firm has multiple pools of investments which generate multiple cash flows, and it needs to calculate the return for those cash flows. The cash flows are stored in a SQL Server database. They are exported to EXCEL, where the data is manipulated so that it can be passed into the XIRR function, and the results of those calculations are then stored back in the database. This process took about 40 minutes.
By using the XIRR function in XLeratorDB, they were able to write a single piece of SQL which produces exactly the same results directly on the database. Processing time was reduced to less than a minute. And as volumes increase, the processing differential will increase dramatically. In other words, EXCEL will slow down much more quickly than SQL Server will. And it is much easier to throw more processing power at the SQL Server database than at EXCEL.
Couldn't they have just written a stored procedure? It turns out that this particular client had been looking for a solution that ran on the database for quite some time, and they had ruled out writing their own XIRR function because it was too complicated. XIRR is a complicated function, requiring root-finding algorithms and iteration to come up with a solution, but that's an even more compelling reason to put it on the database, where there's more computing power.
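To give a sense of why XIRR is non-trivial: there is no closed-form solution, so an implementation has to search iteratively for the rate that makes the net present value of the dated cash flows zero. Here is a minimal sketch in Python using Newton's method; the cash flows are made-up illustration data, and this shows the general technique, not XLeratorDB's actual implementation.

```python
from datetime import date

def xirr(cashflows, guess=0.1, tol=1e-9, max_iter=100):
    """Internal rate of return for irregularly dated cash flows.

    cashflows is a list of (date, amount) pairs; the first entry is
    typically the (negative) initial investment. Newton's method is
    used to find the rate where net present value is zero.
    """
    t0 = cashflows[0][0]

    def years(d):
        return (d - t0).days / 365.0

    def npv(rate):
        return sum(cf / (1.0 + rate) ** years(d) for d, cf in cashflows)

    def dnpv(rate):
        # Derivative of NPV with respect to rate.
        return sum(-years(d) * cf / (1.0 + rate) ** (years(d) + 1.0)
                   for d, cf in cashflows)

    rate = guess
    for _ in range(max_iter):
        step = npv(rate) / dnpv(rate)
        rate -= step
        if abs(step) < tol:
            return rate
    raise ValueError("XIRR did not converge")

# Hypothetical pool: one investment followed by four distributions.
flows = [(date(2008, 1, 1), -10000.0),
         (date(2008, 3, 1),   2750.0),
         (date(2008, 10, 30), 4250.0),
         (date(2009, 2, 15),  3250.0),
         (date(2009, 4, 1),   2750.0)]
print(round(xirr(flows), 6))
```

Getting the day-count convention, the derivative, and the convergence handling right for arbitrary real-world cash flows is exactly the work the client did not want to take on.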
Consider this problem. Assume that you have 150 million loans in your database and you want to calculate the principal and interest payments for each one for next month. How would you go about doing it? 150 million rows is too many for EXCEL. However, using XLeratorDB you can process about 25,000 rows per second (assuming an off-the-shelf 32-bit machine with 4 GB of memory running Vista), so it would take about 100 minutes to perform that calculation. And by using a more powerful server, I am sure that you could get the calculation under an hour, maybe even substantially under an hour. Now you are in a position to calculate the impact of a rate change or a yield curve shift on 150 million mortgages at basically the press of a button. And if you know how to use EXCEL, you will be able to do this in SQL Server using XLeratorDB.
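For readers who want to see the per-loan arithmetic that would run 150 million times, here is the standard level-payment amortization split, sketched in Python with a made-up example loan (the figures are illustrative, not taken from any client data):

```python
def monthly_payment(principal, annual_rate, n_months):
    """Level monthly payment for a fully amortizing fixed-rate loan."""
    r = annual_rate / 12.0
    if r == 0.0:
        return principal / n_months
    return principal * r / (1.0 - (1.0 + r) ** -n_months)

def next_month_split(balance, annual_rate, payment):
    """Split next month's payment into (principal, interest) portions."""
    interest = balance * annual_rate / 12.0
    return payment - interest, interest

# Hypothetical loan: $200,000 at 6% annual rate over 30 years.
pay = monthly_payment(200_000.0, 0.06, 360)
principal_part, interest_part = next_month_split(200_000.0, 0.06, pay)
```

The calculation itself is simple; the point is that a set-based SQL statement applies it to every row in the loan table in one pass, instead of one spreadsheet cell at a time.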
And, here’s the real beauty. What if we added 10 million loans to the database; what would we have to change? Nothing. Since we are using a scripting language, we enter the formula once, and it is automatically applied to all the data that meets the criteria. In EXCEL, we would have to import data, copy and paste formulas, hope that we didn’t delete anything, hope that we copied formulas to all the right cells, and generally have a much harder time of it.
And that’s why XLeratorDB is the right tool now. Every organization is analyzing more data and looking at analyses that it has never done before. And if EXCEL is the tool that is used, it is going to be a slower and more painful process than it needs to be. If you can do it in EXCEL, you can do it faster, more reliably, and with more data, in SQL Server using XLeratorDB.


Copyright 2008-2024 Westclintech LLC