Data First for Anaplan Applications and Data Orchestration

About 65% of planning effort is spent on data orchestration—finding, blending, and transforming data. Iver van de Zand, VP of Product Management, will show how Anaplan simplifies integration with core systems to streamline this critical part of the planning process.

Iver van de Zand 0:00:08.7:  

My name is Iver van de Zand, from Belgium, and I look after calculation engines, as well as Anaplan Data Orchestrator, CloudWorks, and CIO Essentials, so governance, security, and what have you. I really enjoyed the session from [?Ivo 0:00:26.9], who is here as well, where he talked in the plenary about his thinking on a scalable platform. He talked about AI, and if you listen carefully between the lines, he was saying, you know, data is super important: there is no trusted AI without trusted data. And then he picked up on that huge, huge announcement, which I also want to talk about, where we have seen an additional $500 million investment in product on top of our regular budget. So I repeat, $500 million on top of our regular budget, just for product. And I think that says quite a bit about how serious we are about making the product even better, and that is what Chris and I are going to talk about. And if you take that $500 million, and again, Ivo already shared a little bit of this, and you categorize how we are investing it, where we place our cards, it's basically four areas. Chris and I are going to talk you through those areas with a data-first approach in mind. Before we go there, before we talk product strategy, I think it's good to circle back one year, go back 12 months and look at what we have done. Chris and I were debating what to put on the slide, because there was so much last year, but just to pick out a few things. Polaris is there. Super successful. I don't know if it was 85 customers, but almost 100 customers are using that technology. We brought workflow. 

 

Iver van de Zand 0:02:05.7: 

Workflow to the product, allowing you to design your applications for performance. Workflow forces you to design and work in a very automated and process-oriented way. We brought geospatial mapping, with over 100K downloads already. Countless applications: I call out territory and quota and supply chain applications, but you will hear a lot today about all the pre-built applications. And of course, there's also CIO Essentials. Think about separation of duties, for example, which we brought to the market: better security, better governance. But today, if I double-click a little bit on that $500 million, and again, if you ask me or Chris or Ivo where the cards are, it's four areas. Four key areas where it all goes. It has to do with building better applications, more use cases, and ensuring those applications are connected. Why they are going to be connected, Chris will talk about in a second. Connected applications. We will talk about natural dimensionality: planning at any grain you want, planning in the way that fits your business. No limitations by a system; you can plan the way that you want. We will talk about Data First from an orchestration perspective. 

 

Iver van de Zand 0:03:36.3: 

Again, I repeat, 65 per cent of the effort in every planning application, whether it's with Anaplan or another technology, is in data orchestration. End of the line, no debate. Those are proven facts. 65 per cent: go to your own planning project and do the math. Roughly 65 per cent is in data prep, finding the right data, wrangling it, transforming it, and pushing it into your models. And of course, we will talk about Anaplan Intelligence, but a little less, because Ivo covered that in great detail in the plenary session this morning. Well, let me go through it a little bit. When we have discussions in the team about where we innovate, what capabilities we are going to build next, what we put on the roadmap, it is always with a Data First emphasis. Why? To let Anaplan Intelligence do its thing. To have applications talk to each other. To allow you to do end-to-end business intelligence. I can keep adding to those reasons, but Data First is a very strong, I would say, thinking model in the way that we decide what goes on the roadmap and what does not. 

 

Iver van de Zand 0:04:56.4: 

Well, I will walk through those four areas, with a little less attention on Anaplan Intelligence, and I'm going to start with number one, natural dimensionality. I called it out on the slide: planning everywhere, at any grain. With Polaris, we have almost 100 customers so far successfully using it in production, but there are a few other things going on. All the applications that we are building and planning to build are de facto on Polaris. New capabilities and functionalities are brought to Polaris. Does that mean we pay no attention to Classic? Absolutely not. Classic is there. We will never decommission it. It will be there. Super successful. If you are successful with Classic, please keep using it. But if you look at which calculation platform is the platform of innovation, that is Polaris. And the numbers are staggering: 500 quadrillion cells in some of the production use cases, a quadrillion being ten to the power of 15. I have a data warehouse background, and I'm even more impressed by the other number. 

 

Iver van de Zand 0:06:10.6: 

I don't know if you have ever modelled a data warehouse or been involved in that kind of work, but running a dimension of almost 500K members with a response time of less than three or four seconds is ridiculous. It is very, very, very impressive. And it means that, if you look at where we came from with Polaris in such a short time, launched in 2023 and already where we are right now, you see some of the top labels using this technology for all their planning use cases. That is very, very impressive. On Demand Calc is now there. Keep an eye on the public messaging; there's quite a lot of good stuff. Dave Harding and Dave Smith are in the room over there. They own Polaris. Talk to them, but there's stunning content coming up for On Demand Calc as well. If I move a little bit further, double-clicking on Polaris: Polaris and Data First. I think I called it out very clearly: Polaris was built for Data First. Planning no longer starts from a summary, but from the raw atomic signals, and I'm reading that literally from the slide here. 

 

Iver van de Zand 0:07:20.6: 

Plan at the level that you want, the way you run your business. And yes, that does mean that Polaris plays in an area where the density and sparsity of data simply do not matter to us. You can plan at whatever grain you want. With Polaris there is no trade-off, no trade-off when planning at any scale, whether it's financial planning, which by nature has a certain, I would say, summarized grain of data, versus supply chain planning or demand planning, which has a much lower grain of data. You can bring them all together in one engine. And of course, that helps us do the magic with Anaplan Intelligence. And of course, as we speak, we are working very closely on having ADO seamlessly talk to Polaris to make sure that we can do things like end-to-end business intelligence and real-time Anaplan Intelligence. So those are the things for Polaris. I also wanted to call out, and I'm not going to go through a list, that there are some staggering, staggering innovations coming up. 
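To make the density and sparsity point concrete, here is a minimal, purely illustrative Python sketch (it says nothing about Polaris internals, which are not described in this session): a dense cube allocates a cell for every combination of its dimensions, while a sparse store keeps only the combinations that actually carry data, which is why a summarized financial grain and a very fine demand-planning grain can coexist without a trade-off. All dimension and member names below are invented.

```python
# Illustrative only: dense vs sparse storage of planning cells.
skus = [f"SKU-{i}" for i in range(1_000)]        # fine-grained demand dimension
accounts = ["Revenue", "COGS", "Opex"]           # summarized financial dimension
months = [f"2025-{m:02d}" for m in range(1, 13)]

# Dense approach: every combination gets a cell, whether populated or not.
dense_cells = len(skus) * len(accounts) * len(months)

# Sparse approach: store only the combinations that actually hold a value.
facts = {
    ("SKU-42", "Revenue", "2025-01"): 1200.0,
    ("SKU-42", "COGS", "2025-01"): 700.0,
    ("SKU-7", "Revenue", "2025-02"): 310.0,
}
sparse_cells = len(facts)

print(f"dense cells allocated: {dense_cells:,}")   # 36,000
print(f"sparse cells stored:   {sparse_cells:,}")  # 3
```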

 

Iver van de Zand 0:08:31.0: 

We will share the slides, by the way. You see some of the functions over here, and again, the guys are in the room. But a few I want to call out: keep an eye on performance monitoring, which is coming in the second part of the year. It is a game-changing technology that we will embed in Polaris. Also multi-select, for example, which was one of the top-ranked requests from all of you, and which we are bringing to Polaris in the second part of this year as well. Everything you see on the screen is 2025. In terms of what we have for next year, I can't say too much, but there's really, really cool stuff coming. Polaris is the calculation engine of innovation. So if I move a little further, remember we had those four investment areas. Ivo spoke about Anaplan Intelligence. We had natural dimensionality, which I just touched on. We had applications in the middle. And there was data orchestration. 

 

Iver van de Zand 0:09:32.4: 

And data orchestration today is handled by a flagship product called Anaplan Data Orchestrator. Anaplan Data Orchestrator is a full point-and-click solution, and it plays in this area. This is the logical architecture of a typical planning application. On the left-hand side you see the sources where the data typically comes from. It could be S4, it could be Workday for HR, it could be SuccessFactors, it could be Salesforce. You typically know those sources, or a data lake, what have you. With Anaplan Data Orchestrator we play in the orange-colored areas: connecting to data, finding the right data, joining and blending the different sources, and transforming them. What do I mean by transform? Aggregating, splitting up columns, aligning certain functionalities, building in formulas, what have you. Anything you need to do to prepare your data to be consumable by the Anaplan models. And then we disseminate and push the data into our Anaplan models. 
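ADO itself is point-and-click, so there is no code to write, but as a rough mental model the connect, blend, transform and push steps described above map onto something like the following pandas sketch. Every file name, column and join key here is invented for illustration; it is not ADO's implementation.

```python
import pandas as pd

# Connect: pull data from two hypothetical source extracts.
orders = pd.read_csv("sfdc_orders.csv")        # e.g. a Salesforce extract
people = pd.read_csv("workday_people.csv")     # e.g. a Workday extract

# Blend: join the sources on a shared key.
blended = orders.merge(people, on="cost_center", how="left")

# Transform: split a column, add a formula, aggregate to the planning grain.
blended[["region", "country"]] = blended["territory"].str.split("-", expand=True)
blended["margin"] = blended["revenue"] - blended["cost"]
to_load = (
    blended.groupby(["region", "cost_center", "month"], as_index=False)
           .agg(revenue=("revenue", "sum"), margin=("margin", "sum"))
)

# Push: hand the prepared, model-shaped data to the planning model
# (in ADO this is the point-and-click mapping and load step).
to_load.to_csv("anaplan_model_load.csv", index=False)
```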

 

Iver van de Zand 0:10:43.6: 

Now, you might say to me or to Chris, hey, wait a second, Iver, that sounds like an ETL tool. To a certain point you are correct, but there are a few very significant differences, and I'm going to call out two. Number one, unlike a typical ETL tool, we store the data. Why do we do that? Listen to the Anaplan Intelligence story: we want that foundational layer to put Anaplan Intelligence on top of. Think about the end-to-end BI you might want to do; you need that data. Think about providing your data as a product in the near future, pushing it out as a packaged product to your customers or your colleagues or what have you. That is why we store the data. The second reason why Anaplan Data Orchestrator is different from ETL tools is that the way we push out the data is tuned and tailored for Anaplan models. 

 

Iver van de Zand 0:11:51.7: 

We can directly populate lists, hierarchies and models, even in parallel, and no other ETL tool can do that. So for us, that is super essential. And you are all, I assume, Anaplan experts, or you have been working with Anaplan technology, but there are a few important things I want to call out when it comes to Anaplan Data Orchestrator. First, because we built a point-and-click solution, as opposed to something that requires coding, scripting or what have you, we can easily reduce the effort of orchestrating data into Anaplan by 50 per cent. Compare this to a situation where you use multiple data hubs: the effort you need, the skill level of the people, how quickly it works, how easily you change transformations, how easily you can see when something needs to be changed. That is far more complex than it is with a widget-driven, point-and-click solution like Anaplan Data Orchestrator. The second big difference from what we did in the past is how we push the data into the models. The smart people among you have already figured it out: of course we team up ADO with workflow, because then I can completely schedule and automate the pushing into my models, which we can do right now. Yes, Anaplan Data Orchestrator works seamlessly together with workflow. Third, because it's widget-driven and it's a pipeline generator, we have full lineage. That means I can follow the whole path of a data point in a table, in one of the zillion sources you have, through all the transformation flows, knowing exactly which model it lands in and which reports use that model. So if something changes in the source, I immediately know what I need to adapt and which people I need to inform if something is wrong. 
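Lineage here is essentially a dependency graph from sources through transformation steps to models and reports, so that a change upstream can be traced to everything it touches downstream. A minimal sketch of that impact-analysis idea, with every node name invented for illustration:

```python
from collections import deque

# Hypothetical lineage graph: each node lists what depends on it downstream.
downstream = {
    "erp.sales_orders":    ["join_orders_people"],
    "hr.people":           ["join_orders_people"],
    "join_orders_people":  ["aggregate_to_region"],
    "aggregate_to_region": ["model.revenue_plan"],
    "model.revenue_plan":  ["report.exec_dashboard", "report.region_review"],
}

def impacted_by(node: str) -> list[str]:
    """Walk the graph to find every pipeline step, model and report
    that would be affected if this source or step changes."""
    seen, queue = [], deque(downstream.get(node, []))
    while queue:
        current = queue.popleft()
        if current not in seen:
            seen.append(current)
            queue.extend(downstream.get(current, []))
    return seen

print(impacted_by("hr.people"))
# ['join_orders_people', 'aggregate_to_region', 'model.revenue_plan',
#  'report.exec_dashboard', 'report.region_review']
```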

 

Iver van de Zand 0:14:05.4: 

That is what lineage is. It is very important for maintaining your setup in ever-changing circumstances. Now, to compare it a bit, I took a few screenshots. On top, you typically see the setup of a data hub. With Data Orchestrator, it looks like this: no complex setups anymore, fully set-up pipeline orchestration. Same comparison: on top you see a data hub, which in fact is a model, and this is how it works nowadays in ADO, which is fully widget-driven. If you look at the screen, and I think I have a video where I show it, imagine I click this widget over here, where my finger is pointing. It will automatically highlight all the dependencies from other widgets, so I have that trace, that lineage that I talked about, automatically. Now, Anaplan Data Orchestrator plays quite a dominant role, and will play an even more dominant role, in the Anaplan setup. One of the things I wanted to call out is that the same Anaplan Data Orchestrator is used as the technology to bring our consolidation and planning data together. Yes, for a year now, Anaplan has had full financial consolidation, state of the art. And of course, we want the actuals coming from consolidation to drive the planning and what-if scenarios. We ensure that integration using the Anaplan Data Orchestrator technology. Today, that's one direction. 

 

Iver van de Zand 0:15:55.1: 

The other direction back is via workflow, but you will see on the roadmap that we are making that fully bidirectional. And that means that with the integration of [?Konso 0:16:06.0] and the Anaplan platform using ADO, we can now offer a single go-to place for the office of the CFO. And that is very, very, very powerful. Well, I have already called out some of the things that I feel are super important for you to remember when you talk about or debate Anaplan Data Orchestrator, but this is really about taking control of your costs and lowering them. That lineage and that trust in your data, that full end-to-end transparency, is very, very important. And you simply develop faster, because it's widget-driven, point-and-click, with a trace at any moment while you develop your applications. What you see behind me is a little overview of what Anaplan Data Orchestrator does today. Chris and I quickly made this video yesterday evening. You can connect to all the key sources that you know, whether it's Google BigQuery, Snowflake, [unclear name 0:17:07.2], what have you. All the big sources are there, and we are expanding the ways of connecting to databases. And the way that you connect is basically threefold.  

 

Iver van de Zand 0:17:17.9: 

First, you connect to one of those sources. Second, you connect to a CSV file. And third, very, very impressive: you can use an Anaplan model as a source as well. Think of the use cases there. That's very, very powerful. Again, everything you see happening on the screen is widget-driven. On the right-hand side you can easily see all the lineage and all the metadata. And over here, I am building a transformation; in this case, I think we chose to aggregate. Once I've done my transformations and followed that whole trace, I do the mapping towards my model, and finally I decide, via a scheduler or what have you, to populate the data. So this interface of Anaplan Data Orchestrator, which you see happening over here, is very, very powerful. One of the use cases is that you can go to a dashboard and, as you see, I just use an action button to trigger an ADO workflow. That means that from within the dashboard I can easily build a button and say, 'Hey, can you please upload the source data? I want to have the latest version of the data.' And this is the decision tree where you can see all the widgets coming together. Well, what are we building? 

 

Iver van de Zand 0:18:37.9: 

Basically, the investments for the short term, and I'm talking about the next six to eight months, are on the screen. First, write-back: we are investing in write-backs, so not only connecting to source data, but also writing back. You get S3, and Azure SQL is coming. Like I said, we are building this bidirectional financial connector to consolidation. Azure Blob, Google BigQuery, and S4: we already connect to all of these sources, but we're also building the write-back. Another super important investment area: just when you come back from your summer holidays, assuming that you take your holidays in July, when you look into Anaplan in August, you will find data spaces. For those of you who are familiar with hyperscalers and cloud technologies, data spaces are a common concept. I can put certain pieces of my data in one space, and other pieces of my data in another space. What are the use cases? Think of having all my development data in a development space, all my testing data in a testing space, and production data in a production space. So I can build in automated lifecycle management. 
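Anaplan did not show a data-spaces API in this session, so the following is only a conceptual sketch of the idea: the same pipeline logic is pointed at a space, and development, test, and production data stay separated (the same pattern would equally separate, say, marketing data from finance data). All names are invented.

```python
# Purely illustrative: one pipeline definition, deployed into separate spaces.
SPACES = {
    "dev":  {"source": "erp_sandbox", "target_model": "RevenuePlan-DEV"},
    "test": {"source": "erp_test",    "target_model": "RevenuePlan-TEST"},
    "prod": {"source": "erp_prod",    "target_model": "RevenuePlan-PROD"},
}

def run_pipeline(space: str) -> None:
    cfg = SPACES[space]
    print(f"[{space}] reading from {cfg['source']}, "
          f"loading into {cfg['target_model']}")
    # ...connect, blend, transform, push, as in the earlier sketch...

run_pipeline("dev")   # promote the same definition through test and prod
```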

 

Iver van de Zand 0:19:52.9: 

Second use case: Chris wants to run marketing, so he has all his users and marketing data in a marketing space. I am running finance, so I have all my data in the finance space, and we share some commonalities and interchange data. So the data space concept is super important. Third topic: we are investing a lot in connecting to data. Next month you will get SFTP and Redshift. We're also building a public API to push data into ADO. Some of you are using technologies like Boomi and MuleSoft; using that public API, you can push data into ADO from those kinds of technologies. The fourth investment area we are working on is performance. Performance is already five times better. We have very, very strong performance, also at large volumes, but we want to set the bar even higher, so there are a number of areas where we are investing in performance. Before I close down, a few things I want to say. Lifecycle management: I already called it out when bringing up the data space concept; later this year, we will gradually build in lifecycle management. And a few things I really want to call out on the public API: because we have the public API, we are also considering, for example, hooking Anaplan Connect into those APIs. So I hope this gives you a glimpse of where we are with Data Orchestrator. But there are a few other things that are super important. And Chris, I know you live and die with applications, and I know you have a fantastic roadmap. So thanks for making time, and maybe you can tell us a little bit about the applications. 
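The ADO public API mentioned here is still on the roadmap and no endpoints were shown, so every URL, path, and field in this sketch is a placeholder, not a documented Anaplan API. The point is only that an iPaaS tool such as Boomi or MuleSoft, or a plain script like the one below, would push records over HTTPS with a bearer token.

```python
import json
import urllib.request

# Hypothetical endpoint and payload: placeholders only, not Anaplan URLs.
URL = "https://api.example.com/ado/v1/datasets/sales_orders/records"
TOKEN = "replace-with-a-real-bearer-token"

records = [
    {"order_id": "SO-1001", "cost_center": "CC-10", "revenue": 1200.0},
    {"order_id": "SO-1002", "cost_center": "CC-12", "revenue": 310.0},
]

request = urllib.request.Request(
    URL,
    data=json.dumps({"records": records}).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print(response.status, response.read().decode("utf-8"))
```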

 

Chris Marriott 0:21:44.3: 

Absolutely. Yes. So good afternoon, everybody. My name is Chris Marriott, part of the product team, as Iver said. Thank you for the introduction. Iver has described two incredibly powerful capabilities of the platform that we are putting in your hands, at your disposal, to plan more granularly and at a larger scale than ever before. Combine that with what Ivo took us through this morning with the AI capabilities, and we're arming you with enough for the next ten years to keep you busy on all the use cases you could possibly ever want. However, there is an easier way for you to adopt all of these capabilities, and that's where applications come in. We're bringing the time to value down from 12 months to 12 weeks, as you saw this morning, and as little as seven or eight weeks in some examples and some use cases. So applications are products available on the Anaplan platform that bring all of this together. We're going to talk about apps for a moment, and then we're going to talk about how we connect them together; I'll bring Iver back up momentarily. You saw this slide this morning, and I wanted to reiterate what Dane was saying. We have used the apps branding before at Anaplan with the App Hub. Applications are not what you would historically find on the App Hub. 

 

Chris Marriott 0:22:57.9: 

These are real, configurable, upgradable products that you purchase, set up, and configure out of the box to get rapid time to value, and that's the secret sauce: they are configurable and they are upgradable. I'm going to show you a video in a second of the application framework that you heard about this morning; that's what makes these things special and turns them into real products. Iver said that all applications are built on Polaris. True. Here is an example, a use case from one of the supply chain applications. When it was deployed originally on Classic, we were dimensionalizing those modules at the product group level and at the customer group level, with about a million combinations consuming 300 gigabytes of workspace. We've moved that over to Polaris to unlock higher dimensionality and finer granularity. Now we're planning over thousands of individual products and thousands of customers, at 13 million combinations, consuming just 30 gigabytes of workspace. 
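The arithmetic behind that comparison, using the figures exactly as quoted from the slide:

```python
# Figures as quoted in the talk.
classic_combinations, classic_gb = 1_000_000, 300
polaris_combinations, polaris_gb = 13_000_000, 30

grain_factor = polaris_combinations / classic_combinations   # 13x finer grain
space_factor = classic_gb / polaris_gb                        # 10x less space

print(f"{grain_factor:.0f}x more combinations in {1/space_factor:.0%} of the space")
# 13x more combinations in 10% of the space
```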

 

Chris Marriott 0:24:04.9: 

So we're looking at a planning application with more than 10 times finer grain, consuming just one-tenth of the space of the original. Iver talked about the statement of direction: lots of exciting stuff coming down the track for Polaris, but Classic is not going away. Not on the slide, but one thing I can share today: by the end of the year we are going to be enhancing our lifecycle management capabilities. We are going to introduce the ability to do what we call partial model promotion, which is something a lot of customers have asked for for a long time: selectively migrate changes from your development into your production environments, equally applicable to both Classic and Polaris. Lots of innovation still coming down the track. So the whole idea is that we're going to deliver these applications as part of the platform. They're going to be there for you to switch on, hydrate quickly, test, iterate, evolve, and integrate into your systems much, much faster than you would if you were building these use cases yourself from scratch. 

 

Chris Marriott 0:25:04.3: 

So I'm going to show you the secret sauce now. This is the application framework. It's a new area of the platform that you will get access to, where you can consume and manage the applications. What we're doing here is configuring the data structures and hierarchies of the individual application. This is essentially an abstraction layer: we've pulled these configurations out, and how you answer these questions then shapes the resulting implementation that gets deployed into your environment. So we've got data structures, we've got hierarchies, we've got business processes, and we've even got down to terminology. For example, say you don't refer to your expenses as expenses, you refer to them as cost. Updating that here will be reflected across the entire application at install and deployment time: update once, see that benefit everywhere. This is not a one-time thing, either. 
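The application framework is configured through the platform UI rather than code, but conceptually the choices Chris walks through could be pictured as a configuration object like the one below. The structure and field names are invented for illustration and are not the framework's actual format.

```python
# Illustrative only: a conceptual picture of an application configuration.
app_config = {
    "application": "Expense Planning",
    "terminology": {"expenses": "cost"},   # rename once, reflected everywhere
    "hierarchies": {
        "cost_center": ["Company", "Region", "Department", "Cost Center"],
        "time": ["Year", "Quarter", "Month"],
    },
    "business_processes": ["annual_budget", "quarterly_reforecast"],
}

def deploy(config: dict) -> None:
    """Stand-in for the framework's deploy/upgrade step: the answers above
    drive what actually gets built in the target environment."""
    label = config["terminology"].get("expenses", "expenses")
    print(f"Deploying '{config['application']}' with '{label}' terminology "
          f"and {len(config['hierarchies'])} configurable hierarchies")

deploy(app_config)   # reconfigure and redeploy when the business changes
```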

 

Chris Marriott 0:25:57.2: 

If your planning processes change year over year, or if you make some acquisitions and you change your hierarchies, whatever it is that impacts your business and how you operate, then you come in, you reconfigure, you redeploy, and you upgrade over the previous deployment, all as part of the application framework. Super exciting, super revolutionary, and transformational in how we deploy Anaplan out into environments. Lots more to come here, watch this space. So Dane said this morning we've got 12 applications at the moment, and we've recently launched four more. I'm not going to repeat what he said, but just to reiterate the message: 30 applications by the end of the year, and that doesn't even include the ones that our partners are going to start to develop as well. As we go into 2026, you can expect to see a lot more applications, really cementing that journey of being able to deploy many use cases rapidly at the start of your journey. So Polaris, ADO, AI, applications, but they're just Anaplan models at the end of the day, right? Well, they're more than that, because we're going to connect them together in a way that we've never done before. Leveraging Data Orchestrator as the vehicle to do this, ADO is going to be the connective tissue, the glue between all of our applications, allowing you to pull them together via a unified data model: shared data, shared metadata, essentially common objects across the different applications. So Iver, we've got these 12 applications today. What does it mean to bring this all together? 

 

Iver van de Zand 0:27:38.3: 

I really like your story, Chris. Just try to imagine for yourself: I have a use case for inventory planning, or a use case for capacity planning, or for financial planning. If you think about those use cases, they most likely share a number of key business entities, and those could be a currency table, Chris, or a SKU table, or a cost center table. The commonalities that they share, we call them a kind of bus: the unified key business entities of those applications. And Chris already told you, guess where we are storing those data models? In ADO. The unified data model, the way to glue all the applications together, is done through ADO, Chris. I have tried to visualize it over here, and I remember all the discussions, Chris, that we had on the unified data model. I could make this slide very, very long, but you can imagine for yourself what the advantages of such a unified data model are. In one place, for all my use cases, I can ensure that those key business entities are consistent and apply everywhere. I can ensure in one place that they are all configurable: if I want to add a hierarchy level, I change it over there and it applies to all applications. I have one place to manage that unified data model, Chris, and I think it's on the slide and I just want to emphasize it. It's my enthusiasm, and I think it's very important. 
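A minimal sketch of that "bus" idea: the shared key business entities are defined once (conceptually, in ADO) and every application references them instead of redefining them. The entity names follow the examples above; the structure itself is invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class UnifiedDataModel:
    """Shared key business entities, maintained once (conceptually, in ADO)."""
    currencies: list[str] = field(default_factory=lambda: ["USD", "EUR", "GBP"])
    cost_centers: list[str] = field(default_factory=lambda: ["CC-10", "CC-12"])
    skus: list[str] = field(default_factory=lambda: ["SKU-7", "SKU-42"])

@dataclass
class Application:
    name: str
    shared: UnifiedDataModel   # every app points at the same entities

udm = UnifiedDataModel()
apps = [
    Application("Inventory Planning", udm),
    Application("Capacity Planning", udm),
    Application("Financial Planning", udm),
]

# Change a shared entity once...
udm.cost_centers.append("CC-99")

# ...and every connected application sees the same, consistent entity.
for app in apps:
    print(app.name, "->", app.shared.cost_centers)
```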

 

Chris Marriott 0:29:30.1: 

It is also your slide, so it feels unfair of me to present this one. But the one thing I will add is the extensibility part here. Not only does this allow us to build additional applications more effectively and in a consistent manner, whether they're Anaplan-built or partner-built, all on this common foundation, it also allows you to extend the applications themselves in your environment, on top of a unified data model driving that consistency and quality that Iver was talking about. So, some key takeaways. I don't know if you've heard us mention data at all today. It's kind of important to everything we're doing, right? So data, data, data, and I loved the analogy this morning from AWS, from Phil, about data being like milk. I think we're going to steal that one. Enterprise-scale decision making: you've seen the power of Polaris, we're realizing it, we're baking it into the applications, we're committed to Polaris as the future, and we are bringing it all together in the form of connected applications. Thank you very much. 

SPEAKERS

Chris Marriott, Senior Director, Product Management, Anaplan

Iver van de Zand, VP Product Management, Anaplan