Monday, April 26, 2010

Florida State U Transforms Reporting with Business Intelligence

By Dian Schaffhauser

Michael Barrett came to Florida State University in 2003 to manage a conversion to a new enterprise resource planning suite that was being implemented as a joint project among Financial Operations, Human Resources, and ITS. The university's move off of IBM's DB2 and onto Oracle's PeopleSoft software was driven by a 2002 Florida state constitutional amendment that mandated creation of a statewide board of governors responsible for the university system. Suddenly the state universities had to get off state-run systems that handled processing for finance and HR operations. The Tallahassee-based institution, with 40,000 students, is now running PeopleSoft Enterprise 9.0 as the replacement platform.

Online Management of Networked Information, or OMNI, the university's name for its ERP system, consists of an enterprise portal, financials, HR and payroll, and enterprise performance management. That word, "online," is revealing. Since March 2008, Florida State has been engaged in an ongoing effort to move off third-party BI and legacy reporting tools and to open its data systems to 1,500 active users on campus through several analytics tools provided in Oracle Business Intelligence Suite Enterprise Edition Plus (OBIEE). These include a query system; dashboards; Microsoft Office-integrated analytics; and "iBots," e-mail alerts sent when specific user-set conditions in the data occur.
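As a rough illustration of the kind of condition-based alerting iBots provide, here is a minimal, generic sketch in Python. It is not the OBIEE API; the table, threshold, and addresses are hypothetical. A scheduled job queries a warehouse extract, checks a user-set condition, and sends an e-mail only when that condition is met.

```python
import sqlite3
import smtplib
from email.message import EmailMessage

# Hypothetical user-set alert: notify a subscriber when open purchase
# requisitions in a warehouse extract exceed a threshold.
WAREHOUSE_DB = "warehouse.db"        # assumed local warehouse extract
ALERT_THRESHOLD = 100
RECIPIENT = "budget.manager@example.edu"

def open_requisition_count(db_path: str) -> int:
    """Count open items in a (hypothetical) requisitions table."""
    with sqlite3.connect(db_path) as conn:
        row = conn.execute(
            "SELECT COUNT(*) FROM requisitions WHERE status = 'OPEN'"
        ).fetchone()
    return row[0]

def send_alert(count: int) -> None:
    """E-mail the subscriber when the user-set condition is met."""
    msg = EmailMessage()
    msg["Subject"] = f"Alert: {count} open requisitions exceed threshold"
    msg["From"] = "bi-alerts@example.edu"
    msg["To"] = RECIPIENT
    msg.set_content(
        f"There are currently {count} open requisitions "
        f"(threshold: {ALERT_THRESHOLD})."
    )
    with smtplib.SMTP("localhost") as smtp:  # assumed local mail relay
        smtp.send_message(msg)

if __name__ == "__main__":
    count = open_requisition_count(WAREHOUSE_DB)
    if count > ALERT_THRESHOLD:  # fire only when the condition occurs
        send_alert(count)
```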

The university estimated that it has saved $360,000 in license and maintenance fees as well as infrastructure and administrative support costs. Recently, Barrett--now the university's CIO and associate vice president for IT Services--discussed the business analytics aspects of the project he's been involved with for the last seven years.

Barrett: Prior to that we were in a typical IT report shop mentality, where people would line up wanting reports, which we would deliver to them six months or a year later, depending on the workload and availability of resources. People couldn't be dynamic about the things they wanted. They had to think about them way in advance.

With business intelligence, a lot of offices go off and do their own reports. If it's not terribly complex, they can get some assistance from us, but they can turn things around in a day or two in a lot of cases. Before, it might have taken them months.

Schaffhauser: Had dashboards been in use at the campus before they were introduced with this latest initiative?

Barrett: Not really. It evolved from printed reports, of course--delivered printed reports. Then, with the use of computers, it became more of a "run your report and print it yourself" approach. We just distributed the printing function out to the individuals who needed it. Then it became online viewing of reports with more flexible parameters, with the ability to put in a search and pull up a specific report so you didn't have to print out everything. But dashboards weren't really there. We didn't have software that provided them.

And I have to tell you, when we first started the project, a lot of people thought about it as reporting. We actually talked about it as a reporting tool, not a business intelligence tool, because the culture wasn't really ready to even think about what business intelligence meant and what the dashboards would mean. Now everybody uses dashboards, and it's just second nature.

Schaffhauser: What kind of training did you put users through?

Barrett: For developers and people who were going to be developing report queries, we did a series of offsite and onsite training sessions--generally about a week for people who were going to be doing a lot of query and report development.

For end users in the field, we just provided amphitheater-style training, where we invited large numbers of people and showed them the features on a big screen. We produced some training materials online that showed them how to click the buttons and some of the basics. "Here's what the new dashboards look like. Here are the reports. You used to get these before by going directly into the system and running a report. Now you're going to get it by clicking this button." Then we offered workshops to handle any issues where people thought they got it but were still having a little bit of an issue. They could drop into a lab setting and get some help. We did that for about a month.

Schaffhauser: After you made the shift to PeopleSoft tools, did the reports your crew was generating simply emulate the previous generation of reports?

Barrett: They did to a large extent. We were driven by that because it's what people knew. That was part of the difficulty. A lot of what made the reporting difficult was that the old format didn't lend itself well to the new PeopleSoft data structure. Things were very different once we implemented PeopleSoft. That created a situation where the old-style reports--while they made sense to people in the old days and gave them a sense of comfort--really didn't apply in the new world. But nobody stepped back to take a look at it and say, "What do we really need? What's the information content that's important to us?" Still, it was a good bridge, because it gave people a bit of a comfort level moving from the old to the new. Now, with the new dashboards, we're providing information in a much more tabular fashion, with more graphics, more pivot tables, and other types of analytical approaches that we never had in the old system.

In fact, we used to generate those by downloading the data into Excel spreadsheets and distributing them. In some cases we're actually getting ahead of what people were already doing manually anyway. When people saw [the dashboards], they recognized them: "Oh, I already do that today, manually. This is great. Now I just push a button and get it."

Schaffhauser: How did you establish the parameters for what those dashboards would consist of?

Barrett: We had a focus group team. Our governance is multi-faceted. We have people from our research office, administration, our financial group, and our academic side. We got all those people together and talked about what the priorities were. Our first phase with OBIEE concentrated on a set of 12 or 13 subject areas and about two dozen dashboards. We said, if we get those out, that would satisfy about 80 percent of the needs of the average person in the university who just needs access to the data--if we could do that and be successful, then we'd move on and fine-tune some other areas, which is what we did.

Schaffhauser: It sounds like IT is still kind of in charge of setting up the dashboards.

Barrett: IT definitely owns the warehouses and the transformation logic that gets the data into the data warehouse. What goes in has to be decided by committee, and committee members need to know what the data looks like in the warehouse to get the data they want. That's one piece IT definitely handles. IT is also responsible for the metadata, or data definitions, which is what makes that data available to the end users. We're responsible for that.

We take care of making sure that everything runs. These things update generally every night, though some are real time. We do all the troubleshooting, make sure things run properly, and make sure all the data reconciles. The data has to be accurate; otherwise, people won't trust it. All of that we do in IT. Then we do the complex dashboard development for big projects, and we provide support for people when they want help doing their own dashboards.
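As a rough illustration of the nightly reconciliation Barrett describes, here is a minimal sketch in Python, assuming a source extract and a warehouse table that should agree on a control total. The database files, table names, and columns are hypothetical, not FSU's actual schema.

```python
import sqlite3
from datetime import date

def control_total(db_path: str, table: str) -> float:
    """Sum a control column so source and warehouse can be compared."""
    with sqlite3.connect(db_path) as conn:
        row = conn.execute(
            f"SELECT COALESCE(SUM(amount), 0) FROM {table}"
        ).fetchone()
    return row[0]

def reconcile(source_db: str, warehouse_db: str, tolerance: float = 0.01) -> bool:
    """Flag the nightly load for troubleshooting if totals drift apart."""
    src = control_total(source_db, "gl_transactions")        # hypothetical source table
    whs = control_total(warehouse_db, "fact_gl_transactions")  # hypothetical warehouse fact
    ok = abs(src - whs) <= tolerance
    status = "OK" if ok else "MISMATCH"
    print(f"{date.today()} reconciliation {status}: source={src} warehouse={whs}")
    return ok

if __name__ == "__main__":
    if not reconcile("erp_extract.db", "warehouse.db"):
        # In practice this would notify the BI team rather than just exit.
        raise SystemExit(1)
```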

Schaffhauser: Have people experienced dashboard fever?

Barrett: In some pockets, yes. Right after that initial rollout, we got people who were really excited about it and really wanted to push things in their research area. We did a big phase of rolling out a bunch of things in the research reporting area. We do a lot of grants and contract processing here. That's a little different from your day-to-day business of educating students.

More recently, we've really gotten involved in our student information systems. That was really the last leg of the three-phase project. We're actually finishing that up right now. We've created about 20 or so dashboards already for them in the last six months. It's expanding quite a bit.

Schaffhauser: Do you find the dials are getting smaller and smaller--to accommodate the growth of KPIs that need to be monitored?

Barrett: In the student information area, there are so many that we created a dashboard of dashboards. We have a main dashboard, and we split it up into 20 different subject areas, almost like a menu. From that dashboard you click and you go into another area that's more focused. There isn't a lot of need to cross over. The graduate waiver data doesn't need to sit on the same dashboard as information on undergraduate admission. They're inherently different subject areas. So we tend to work from a subject-area approach. We try to keep dashboards focused on the information they're trying to convey. Once users know generally what subject they're interested in, they can go into that area and find dashboards with related content they might want to see on the same screen.
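As a rough sketch of the "dashboard of dashboards" idea, the structure below models a main menu of subject areas, each grouping related dashboards. The subject and dashboard names are illustrative, not FSU's actual catalog.

```python
from dataclasses import dataclass, field

@dataclass
class SubjectArea:
    """One focused subject area grouping its related dashboards."""
    name: str
    dashboards: list[str] = field(default_factory=list)

# The main dashboard acts as a menu of subject areas (names are examples).
MAIN_MENU = [
    SubjectArea("Undergraduate Admissions", ["Applications", "Yield"]),
    SubjectArea("Graduate Waivers", ["Waiver Spending", "Waivers by College"]),
    SubjectArea("Enrollment", ["Headcount", "Credit Hours"]),
]

def render_menu(menu: list[SubjectArea]) -> None:
    """Print the main dashboard as a menu of focused subject areas."""
    for area in menu:
        print(area.name)
        for dash in area.dashboards:
            print(f"  - {dash}")

if __name__ == "__main__":
    render_menu(MAIN_MENU)
```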

Schaffhauser: Is there still room at the university for traditional reporting?

Barrett: The other product we got with OBIEE is BI Publisher. That provided the one missing component--the ability to produce what we call traditional production reports and distribute them in various ways. We needed good quality and control of the output to make it look just a certain way. We have several reports like that, and we use the BI Publisher tool for them.

All in all, [the business intelligence system] has become a one-stop shop for people to do everything from traditional dashboarding and queries to what I'd call production reports, as well as moving into the area of online or operational analytics. We're actually looking at real-time data directly from our transaction system using the dashboards, because they provide a convenient way to look at it. We've got enough flexibility in the system to do that. People don't have to know where it's pointing. They just have to know the subject areas and why they care about looking at that data.
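As a sketch of how a subject area can be routed to either the nightly warehouse or the live transaction system without users knowing where it points, consider the mapping below. The subject names and connection targets are hypothetical.

```python
# Each subject area declares its backing source; users only pick a subject.
SUBJECT_SOURCES = {
    "financial_summary": "warehouse",      # refreshed nightly
    "purchase_requisitions": "realtime",   # read from the transaction system
}

CONNECTIONS = {
    "warehouse": "warehouse.db",           # hypothetical warehouse extract
    "realtime": "erp_transactions.db",     # hypothetical live transaction copy
}

def source_for(subject: str) -> str:
    """Resolve a subject area to the data source that backs it."""
    kind = SUBJECT_SOURCES.get(subject, "warehouse")  # default to the warehouse
    return CONNECTIONS[kind]

if __name__ == "__main__":
    print(source_for("purchase_requisitions"))  # -> erp_transactions.db
```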

Schaffhauser: Do you have any advice for making sure that the project is staying in alignment with the initial business requirements--those initial business goals?

Barrett: Certainly, if you have multi-disciplinary inputs in this, you need a strong steering committee or governance body to get updates from the project team and to take into account things that are changing and to make sure things stay on course, and to effect communications with all the stakeholders. I think that's very important.

Also, the thing we did that I think made it successful was that we limited our scope in the early days. We had tons of ideas, but we said, "We've picked these 13 subject areas because they represent about 80 percent of our needs." If we could get 80 percent of our needs out in a reasonable project in less than a year and have it out in the field where people were using it, then we'd consider that success, and the rest would come in time. We didn't try to boil the ocean in this initial project, where it would have taken us years to get it out the door, and frankly people would have forgotten why we were even doing it by the time we were done. So pick a reasonable scope that offers meaningful success and focus on that. Then use your governance to keep you focused on that through the project.

Schaffhauser: Eighty percent seems ocean-like.

Barrett: Twenty-six dashboards may seem like a lot. But, we felt, this is really going to have widespread university impact. It's going to go across a lot of areas. It really did. I was very surprised. The end user adoption of it was almost a non-event. We showed people the reports. We showed them the dashboards. They got on and just started using them. We didn't get pushback. We didn't get a whole lot of complaints. We had a couple of problems starting up, like anyone does. And we had to do performance tuning here and there, and the usual things you get into when you scale something up into an enterprise level.

But we've got a fairly sizable group of people using it on a fairly regular basis. We don't have a whole lot of support type issues with it. Most of the support issues we have are in the background. They have to do with data transformation. The users don't even see that. We just have to respond to them and fix them. That's not really about the tools. That's about data quality, and the issues you get into in general when you're dealing with large quantities of data and somewhat changing rules.

Overall, though, it's been very well received. And that's why we're moving ahead into our student area. We kept that for last because student-related information is our most critical from a business standpoint. We figured, if we could do it in an administrative area like finance and HR, and it worked well there, then we'd get grounded in our experience and we could go and hit the student area. They wouldn't be the guinea pigs, so to speak.

Schaffhauser: So that's what's next?

Barrett: It's ongoing right now. We're due to get that third phase done around the end of June this year. That'll complete our initial suite. From there on out, it'll just be incremental improvements as needed.

Source: campustechnology.com