
Sunday 8 September 2013

Alma analytics

Asaf Kline, Alma Product Manager, Ex Libris
[live-blog IGeLU Conference 2013, Berlin - please forgive typos etc.]

The roadmap goes from descriptive analytics (what happened) to predictive analytics (what will happen) to prescriptive analytics (how we should take action). We're still at the descriptive stage. This should help to give a deeper understanding of what's going on in your library. Alma is optimised for analytics, e.g. cost per use; a lot of work is being done behind the scenes to make this happen. The model is a star schema: in the middle there is a fact table and then there are the different dimensions ("strands"). It's optimised for reporting. There is a shared platform and institutions can share their analytics data. It is also role based, so any report or dashboard created is sensitive to an individual's role.
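To make the star-schema idea concrete, here is a minimal sketch in Python of a fact table joined out to its dimensions; the table and field names are invented for illustration and are not Alma's actual schema.

```python
# Minimal star-schema sketch: a central fact table referencing
# dimension tables, queried with plain Python. All names are invented.

# Dimension tables: descriptive attributes, keyed by ID.
dim_title = {1: {"title": "Journal of Examples", "publisher": "ACME"}}
dim_date = {20130908: {"year": 2013, "month": 9}}

# Fact table: one row per measurable event, holding keys plus measures.
fact_usage = [
    {"title_id": 1, "date_id": 20130908, "downloads": 42, "cost": 10.0},
    {"title_id": 1, "date_id": 20130908, "downloads": 8, "cost": 2.0},
]

# A report is a join from the facts out to the dimension strands.
for row in fact_usage:
    title = dim_title[row["title_id"]]["title"]
    year = dim_date[row["date_id"]]["year"]
    print(f"{title} ({year}): {row['downloads']} downloads")
```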

There is functionality to schedule and distribute reports, and any user can subscribe to the reports that they want to receive. Analytics also works with APIs: for any report created you get an XML representation of it, which can be sent to the programme of your choice. The analytics keep a history, and that's how it can be predictive: based on previous years we can see what may happen (e.g. fund burn-down).
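As a rough sketch of consuming such an XML report programmatically (the base URL, report path and parameter names below are assumptions for illustration, not details given in the talk):

```python
# Hypothetical sketch of pulling an analytics report as XML over REST.
# The base URL, report path and parameter names are assumptions.
import requests
import xml.etree.ElementTree as ET

BASE = "https://example-alma-host/almaws/v1/analytics/reports"  # assumed
params = {
    "path": "/shared/MyInstitution/Reports/CostPerUse",  # assumed path
    "apikey": "MY_API_KEY",  # placeholder credential
}

resp = requests.get(BASE, params=params, timeout=30)
resp.raise_for_status()

# The payload is XML; parse it, then hand the rows to the
# programme of your choice.
root = ET.fromstring(resp.content)
print(root.tag)  # inspect the envelope before mapping rows
```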

Usage and cost per use. We take the COUNTER data from the vendor and make it available both as a subject area and from within Alma. UStat is used for loading in vendor usage via a spreadsheet or SUSHI. When we know what's in your inventory and how much you pay for it, we can provide cost-per-use data. Usage is at the title level, but when libraries buy packages we roll the title data up to the package level and give data for the package. The analogy is like TV: we pay for lots of channels and may only watch two. What's important is the cost per use, not so much what we watch. (Question here: we often raise purchase orders at package level but get invoiced at title level, so that could be a problem?...)
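A back-of-the-envelope sketch of the package-level roll-up described above, with invented titles, counts and price:

```python
# Roll title-level COUNTER usage up to the package and compute cost
# per use. Titles, usage counts and the price are invented examples.
package_price = 12000.00  # what the library pays for the whole package

title_usage = {  # downloads per title, e.g. from a COUNTER report
    "Journal A": 950,
    "Journal B": 40,
    "Journal C": 10,
}

package_usage = sum(title_usage.values())
cost_per_use = package_price / package_usage
print(f"{package_usage} uses, {cost_per_use:.2f} per use")
# -> 1000 uses, 12.00 per use: only two "channels" carry the usage
```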

This has been a central activity in 2013, now being rolled out to Data Centers with a continuing effort to ensure scalability of infrastructure. There are additional subjects that we're working on:
- Usage (Alma generated)
- Requests and booking
- Vendors and vendor accounts (more analytic in nature)
- BIB (bibliographic data) - unfortunately analytics and MARC don't work well together, so there will be tools to search for and process the info we have in the catalogue; at the moment there is the option to choose five local fields to get data from

Usage data is Alma-generated, captured via the Alma Resolver; it will answer questions such as (see the sketch after this list):
- What services were provided?
- What queries ended without any services?
- What were the sources/ip ranges that accessed the system?
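A hypothetical sketch of answering those questions from resolver events; the record format here is invented for illustration:

```python
# Count resolver queries that ended without any service, and tally
# the source IPs. The event records are invented for illustration.
from collections import Counter

events = [
    {"ip": "192.0.2.10", "services_offered": 2},
    {"ip": "192.0.2.10", "services_offered": 0},
    {"ip": "198.51.100.7", "services_offered": 1},
]

no_service = sum(1 for e in events if e["services_offered"] == 0)
by_source = Counter(e["ip"] for e in events)

print(f"{no_service} queries ended without any service")
print(by_source.most_common())
```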

We want to provide a set of tools to gain insight into the structure and use of your collection, e.g. print or electronic inventory and usage. What's missing is overlap analysis, e.g. comparing titles across packages, which combines a system job that we run with analytics. We want to take these tools to collection level, e.g. shelfmark/classification ranges, and embed them/make them actionable so that they become part of the purchasing workflow, based on facts and analysis. We also want to start generating KPIs (performance measures) to help evaluate the success or failure of a particular activity, e.g. how much time it takes to process a purchase order or a vendor supply, the average % of items not picked up by the user, request processing time, etc.
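The overlap analysis being asked for can, at its simplest, be a comparison of identifier sets; a minimal sketch with invented ISSNs:

```python
# Overlap analysis sketch: compare titles across two packages by
# identifier (ISSNs here). Package contents are invented examples.
package_a = {"1234-5678", "2345-6789", "3456-7890"}
package_b = {"2345-6789", "3456-7890", "4567-8901"}

overlap = package_a & package_b
unique_to_a = package_a - package_b

print(f"{len(overlap)} titles in both packages: {sorted(overlap)}")
print(f"{len(unique_to_a)} titles only in package A: {sorted(unique_to_a)}")
```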

The data will also be made available to be viewed on mobile devices. We want to bring analytics up to the network level, i.e. cross-institution reporting, using benchmark analytics (comparing to others "like me") and network analysis (members of a consortium), so this is more about disclosing strengths/weaknesses/overlap and how collaboration can be improved.
