Employee salary information is very confidential, yet many companies maintain salary and bonus information in Excel spreadsheets, and often email these files as attachments to large distribution lists. A simple oversight by someone could put this information in the wrong hands and cause headaches for the HR department and other managers.
We did a project for a company to simplify their annual salary review process. With our solution, Human Resources did not have to send out Excel spreadsheets containing confidential data to each manager. Instead, each manager received a blank “Compensation Adjustment” spreadsheet that was connected to a SQL Server database. Each manager simply retrieved their employee data into the spreadsheet. XLhub’s back-end logic ensured that each manager saw only the data for their direct reports. The managers then entered salary adjustment recommendations in the spreadsheet, and instead of emailing the file back to HR, they simply “pushed” the data to the SQL Server database.
Since all managers pushed their data into the same database table, reporting was extremely simple. Additionally, confidential data did not travel through the email system. And, no one had to individually consolidate data from multiple Excel spreadsheets.
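The pull/push flow described above can be sketched in a few lines. This is a minimal illustration, not XLhub's actual implementation: the table and column names are hypothetical, and an in-memory SQLite database stands in for the SQL Server back end. The key idea is the same — the pull query is filtered by manager, and all pushes land in one shared table.

```python
import sqlite3

# Stand-in for the shared SQL Server database; schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE compensation (
    employee TEXT, manager TEXT, salary REAL, adjustment REAL)""")
conn.executemany(
    "INSERT INTO compensation VALUES (?, ?, ?, ?)",
    [("Alice", "pat", 70000, None),
     ("Bob",   "pat", 65000, None),
     ("Carol", "lee", 80000, None)])

def pull_direct_reports(manager):
    """Retrieve only the rows this manager is allowed to see."""
    return conn.execute(
        "SELECT employee, salary FROM compensation "
        "WHERE manager = ? ORDER BY employee",
        (manager,)).fetchall()

def push_adjustment(manager, employee, adjustment):
    """Write the manager's recommendation back to the shared table."""
    conn.execute(
        "UPDATE compensation SET adjustment = ? "
        "WHERE employee = ? AND manager = ?",
        (adjustment, employee, manager))
    conn.commit()

print(pull_direct_reports("pat"))   # Pat sees only Alice and Bob
push_adjustment("pat", "Alice", 3500.0)
```

Because every push updates the same table, a consolidated report is a single query over `compensation` — no emailing or merging of individual files.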
Scott Maier is a professor of journalism at the University of Oregon. He has over 20 years’ experience as a wire-service reporter, and his beats have ranged from city hall and the state legislature to Latin America. He is the founder of CAR Northwest, “an industry-academic partnership providing training in computer-assisted reporting to newsrooms and journalism classrooms.” Professor Maier was kind enough to chat with us about his experience with Excel, and how XLhub might be a beneficial tool even for newsroom journalists.
XLhub: Would you say you use Excel on a daily, weekly, or longer basis? Does it vary depending on your story/project?
Professor Maier: I use Excel at least weekly for managing grades and research data.
XLhub: When working as a reporter, to what extent is it necessary for you to know Excel to perform your basic job duties?
Professor Maier: When I was a working reporter, I found my Excel skills to augment my basic reporting and to place me in a special class of reporters who could go beyond traditional daily news coverage.
XLhub: Would you say your strong understanding of Excel has enhanced the quality of your work? Has it given you an edge over other journalists?
Professor Maier: Excel gave me a tool to identify and quantify information and patterns not readily apparent in the flood of data now readily available online. With these skills, I would be assigned to report and write the “big picture” story that put me on Page One.
XLhub: Can you recall a specific project or type of task where you used Excel to improve your work as a journalist? What method(s) did you use?
Professor Maier: Excel was vital for examining thousands and thousands of individual campaign contributions and for detailed analysis of demographic change documented by the U.S. Census.
XLhub: Have you ever felt limited by Excel? Why?
Professor Maier: Excel is great for many things, but database work and statistical analysis often require other software. (And I wish Access were available for Macs, the computers used in many journalism classrooms and newsrooms!)
XLhub: XLhub is an Excel add-on that combines data from multiple spreadsheets into a single repository, where it can be consolidated and analyzed by multiple users. From the standpoint of a journalist and/or someone working in a newsroom, how could you use such a product?
Professor Maier: I see good potential for such a tool.
Here’s a tool for assessing the complexity of your Excel spreadsheet model.
If your model is complex, as a spreadsheet design guideline we recommend separating these three key features of Excel:
- Data Storage
- Calculation
- Reporting
The more tightly coupled the reporting, calculation, and data storage are, the more unwieldy and fragile your spreadsheet will be.
The criteria we use for measuring the complexity include:
- Number of Users: How many people need to collaborate using the spreadsheet?
- Process Complexity: How complex is the business process that is supported by the spreadsheet?
- Calculation Intensity: How much number crunching is performed by the model?
- Data Volume: How much data is contained in the spreadsheet?
- Data Change Frequency: How frequently does the data in the spreadsheet get added, deleted or modified?
- Data Model Complexity: How many different topics are included in the model and how complex are the relationships between the topics?
- Analytical Intensity: How much slicing and dicing of the data is required?
Take a look at this list before starting your spreadsheet model and as you expand it. A model typically increases in complexity over time, and if you want to avoid re-building a model because it has become too fragile, consider the basic spreadsheet design guidelines here.
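The checklist above can be turned into a rough self-assessment helper. This is a hypothetical sketch: the criterion names, the 1-to-5 rating scale, and the threshold are all illustrative choices, not part of any published scoring method.

```python
# Rate each criterion from 1 (low) to 5 (high); names are illustrative.
CRITERIA = [
    "users",             # Number of Users
    "process",           # Process Complexity
    "calculation",       # Calculation Intensity
    "data_volume",       # Data Volume
    "change_frequency",  # Data Change Frequency
    "data_model",        # Data Model Complexity
    "analytics",         # Analytical Intensity
]

def complexity_score(ratings):
    """Sum of per-criterion ratings; expects one 1-5 rating per criterion."""
    missing = [c for c in CRITERIA if c not in ratings]
    if missing:
        raise ValueError(f"missing ratings for: {missing}")
    return sum(ratings[c] for c in CRITERIA)

def needs_separation(ratings, threshold=21):
    """Flag models complex enough to warrant separating data storage,
    calculation, and reporting. The threshold is an arbitrary midpoint."""
    return complexity_score(ratings) >= threshold

ratings = {"users": 5, "process": 4, "calculation": 2, "data_volume": 4,
           "change_frequency": 5, "data_model": 3, "analytics": 2}
print(complexity_score(ratings))   # 25
print(needs_separation(ratings))   # True
```

Re-running the assessment as the model grows makes the slow creep in complexity visible before the spreadsheet becomes too fragile to restructure.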
I would like to share some interesting information I learned from the Spreadsheet Engineering Project. This research project was undertaken at the Tuck School of Business at Dartmouth College. Professors Baker, Powell and Lawson conducted the research.
Quoting from their site, the purpose of the project “is to improve the design and use of spreadsheets by individuals and organizations.” Their research focuses on “identifying best practices in spreadsheet development (design, testing, documenting); procedures for implementing, modifying, sharing, and archiving spreadsheets; and organizational policies relating to standards, training and quality control.”
Even though I have been around spreadsheets for two decades in large and small organizations, and can say “been there, done that” regarding many spreadsheet application scenarios, there’s a lot I haven’t seen. Several of the survey results surprised me; the data is not what I would have guessed.
This is the link to the project site: http://mba.tuck.dartmouth.edu/spreadsheet/index.html
The surveys were conducted in 2005-2006, with about 1600 respondents. Below are some selected questions from the survey.
Ever sat through a PowerPoint presentation with impressive charts and tables explaining trends in your company’s operations? Did you start to wonder where the data used to create those charts came from?
The data (and the chart) probably came from an Excel spreadsheet on someone’s desktop. When the chart has to be reproduced for next month’s presentation, only the author can recreate it, because the know-how required is not recorded in any metrics document or standard operating procedure.
The same can be said about discussions in a 10K or 10Q filing, or a presentation to Wall Street analysts. How can an auditor validate those numbers if he doesn’t have access to the specific spreadsheet? How can a CIO, overseeing a vast business intelligence infrastructure, reproduce the same numbers in case the numbers are challenged?
Excel’s power and ease of use lead to the creation of “mini BI silos” where data from operational systems and even BI reports is further combined, analyzed and reported. The informality of these BI silos causes auditors and CIOs a lot of grief. They cannot put the kibosh on these Excel-based solutions because they aren’t able to provide a better alternative.
As business intelligence consultants, we come across Microsoft Excel wherever we go.
Excel is a fantastic tool. It is easy to learn, incredibly flexible and has powerful data management and business intelligence features. Our clients have sophisticated enterprise systems such as SAP, Oracle and Microsoft Dynamics that process massive volumes of transactions. Our clients also have the leading business intelligence tools in house. It is not uncommon to come across Teradata, Cognos, Business Objects and Microsoft BI products in most environments. Yet, despite all the technology available within the enterprise, employees continue to use Microsoft Excel.
One thing I find amusing is that a user will generate a report in Cognos or Business Objects, then immediately export the data from the report to Excel. Once the data is in Excel, it will be filtered, sorted and analyzed in a pivot table. Graphs will be produced and pasted into PowerPoint presentations. Often, additional columns are created alongside the exported columns to compute ratios or display categories. These new columns then take on a life of their own and are used to make tactical and even strategic decisions.
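The export-then-analyze pattern above amounts to two steps: compute a derived ratio column alongside the exported data, then pivot by some category. A minimal sketch of that logic, with hypothetical field names standing in for a Cognos or Business Objects export:

```python
from collections import defaultdict

# Rows as they might arrive from a BI report export; fields are illustrative.
rows = [
    {"region": "East", "revenue": 1200.0, "cost": 800.0},
    {"region": "East", "revenue":  600.0, "cost": 450.0},
    {"region": "West", "revenue": 1000.0, "cost": 700.0},
]

# The "extra column alongside the export": a margin ratio per row.
for r in rows:
    r["margin"] = (r["revenue"] - r["cost"]) / r["revenue"]

# A minimal pivot: total revenue and average margin per region.
totals = defaultdict(lambda: {"revenue": 0.0, "margin_sum": 0.0, "n": 0})
for r in rows:
    t = totals[r["region"]]
    t["revenue"] += r["revenue"]
    t["margin_sum"] += r["margin"]
    t["n"] += 1

pivot = {region: {"revenue": t["revenue"],
                  "avg_margin": t["margin_sum"] / t["n"]}
         for region, t in totals.items()}
print(pivot)
```

The grief for auditors comes from exactly this: the `margin` formula lives only in someone's workbook (here, only in this script), not in any governed BI layer, yet its output drives decisions.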
The chain between the ERP system and Excel on the desktop can include a replicated database, an operational data store, various ETL repositories, a dimensional data warehouse, cubes and the report. Borrowing a term from the telco industry, Excel is the “last mile” between decision workers at one end and the large enterprise transactional systems at the other.
My prediction is that Excel will continue to be an important part of the BI landscape. Microsoft continues to add impressive new features to it, and together with PowerPivot it fills a space that other enterprise tools cannot.