Agility Unleashed: OpenAI Scales for Exponential Growth

In a rapidly evolving AI landscape, agility is key to business success. Join our fireside chat to explore how OpenAI is shifting from spreadsheets to dynamic planning for faster decision-making and operational agility, supporting sustainable growth in a competitive world.

Bill Howell 0:00:07.1: 

So, thank you, Jeremy and Hank, for joining. We really appreciate it. Danielle gave a great introduction to each of them, so I won't go into any more detail - other than to say: in his spare time, Jeremy loves to cook anything fresh, hot and delicious. Hank loves to do home improvement, so if you need your kitchen done, Hank is your guy. But both have been in this ecosystem for a long, long time, and we appreciate their insights. What I thought I'd do is have Jeremy and Hank walk through the OpenAI journey to kick us off, and talk a little bit about how they came to choose Anaplan. Then we'll ask a few questions and leave about ten-to-fifteen minutes at the end for Q and A from the audience. So, prepare your questions and be ready to ask some good ones. With that, I'll turn it over to you two gentlemen. 

 

Hank Woo 0:00:52.0: 

Thanks, Bill. Before we start, a quick introduction on OpenAI and PwC. I'm sure you guys know us individually, but the PwC and OpenAI relationship is actually 360 degrees. We are one of the first and largest enterprise subscribers for OpenAI. We also consult with OpenAI on some of their largest strategic and operational initiatives. Then we go to market together, building applications that help enterprises, and so on and so forth. So, with that… 

 

Jeremy Stern 0:01:27.5: 

Yes, so as Bill alluded to, we're going to talk a little bit about why Anaplan and why PwC. As you can see from the slide, immense growth at OpenAI over the last few years. The big highlight number we want to point out is over 500 million weekly active users. We've been seeing multiples of that number every year - really, every couple of months. So, we had to figure out how to get out of the old world of planning, out of Excel and out of G Sheets, because that many users, this much revenue growth and this many new initiatives could not feasibly - or effectively - be planned on G Sheets. So, we ended up going with Anaplan, and we'll talk a little bit later about what we've done within the tool. 

 

Hank Woo 0:02:20.2: 

All right, so further setting the context of PwC and OpenAI's partnership on expanding Anaplan use cases: this was a project entailing eight weeks of end-to-end development time. I'll let that sink in for a second; eight weeks. It included building out the whole P and L as the backbone - loading financial plan information, including some standard non-workforce modeling capabilities. CapEx was built in there, then depreciation, and to wrap it up, allocation - dynamic, multi-step allocation was built there. We'll talk about it, but Jeremy and team had built the headcount planning model. We came in and teamed up to build the price side of the workforce plan - a PxQ (price times quantity) model, in the simplest terms. So, we helped build that, the comp model and so on and so forth to complete the P and L package. Then, from a UI reporting perspective, dimensionalized BvA (budget versus actuals), along with some custom management and GAAP reporting. 

 

Jeremy Stern 0:03:35.5: 

Again, as Hank mentioned, we were all over the place in figuring out where to start. So, the first model we undertook - our workforce planning model - you can think of it as more of an operational model. I know all the finance people in this room are probably averse to just looking at numbers of heads and not the dollars associated with those heads, but we had to start somewhere. Over six weeks, we built out this operational workforce model. We integrated it bidirectionally with our HRIS and our ATS. Really, it was more of a breadth than depth model. It now updates on an hourly cadence and gives our recruiting function, our HR function and any people manager a real-time view of their org. We also create all positions out of this model. A big pain point for us was - as I mentioned earlier - everything was coming out of Google Sheets, including every role that we were going to hire.  

 

Jeremy Stern 0:04:34.9: 

We were growing about three X year-over-year for a few years there, so that ended up not working out very quickly for us, so we built out this workforce model. Again, it creates positions, manages positions and gives the entire org not only a view of their organization, but also a view of what Anaplan can do. Then a natural next step was to move on to the income statement. That's what PwC and the team have helped us build over the last eight-to-ten weeks. We have full integrations end to end on this model. We're doing top-line forecasting, looking at doing unit economics, a full view of all of our vendors. My personal favorite part: the big cube. Polaris has allowed us to look at almost every intersection that we would've wanted, which blends really nicely with the way that we use our tools as well. A big limiting factor that we were thinking about when we initially considered Polaris was: is this too much information? Is this too much granularity?  

 

Jeremy Stern 0:05:42.9: 

But what we realized very quickly is that the seamless integration between these large cubes and our tool allows us to very quickly find the big spikes, the big dips and, really, the important insights that we need to do our job more effectively. Then, looking to the future: over the - what - combined 16 weeks that we've done Anaplan development, we've learned a lot. We have a very, very long roadmap, so a few of the items I'll call out on the slide here. One of the big ones - as I'm sure you guys may've heard - is a big initiative called Stargate that we need to start planning out, a big combination of CapEx and compute planning. We say at work that compute is destiny in this space. So, hopefully this is a bit of a window into our destiny: trying to figure out where to get compute, when we can get it and where it needs to go.  

 

Jeremy Stern 0:06:36.7: 

As I mentioned earlier, equity management, gross margin, long-range planning, the rest of our financials and then the big one: revenue planning. I was joking with this team earlier that, in order to build a revenue model, you first have to know how to build the revenue model. That seems to change monthly for us. So, PwC and the Anaplan team have been a huge help in working through this thought process with us and figuring out what to build and when. Then finally, a couple of quick use cases I wanted to highlight. As I'm sure you guys can guess - and as I'm sure a lot of you out there have been hearing from your leadership - we're trying to really push this concept of AI-first principles. We heard in the earlier talk that AI-first is going to be the future. We're trying to really bootstrap different use cases in the ways that we use it.  

 

Jeremy Stern 0:07:32.3: 

When asked, 'How do we use our tools in line with Anaplan?', we were able to bucket it into these three main buckets. First, everyone's favorite: documentation. Documentation, as I'm sure anybody who's built a model here knows, can be the bane of your existence. But what we ended up doing was, instead of having to redo every PDF, every doc, every video, we just fed all of it into ChatGPT and built a specific GPT for it. Now this is where everyone goes for access information, for usage information, for quick tips. We can also seamlessly integrate the great learnings that Anaplan has available on their community into this GPT. So, documentation is first. Next is retrieval. What we do is, we try and feed all of the outputs of our forecast into the tool. It can give us immediate insights or answers to simple questions.  

 

Jeremy Stern 0:08:29.5: 

As I'm sure everyone in this room has experienced before, when an executive asks you, 'What's this number, what's this number going to look like, what's going to be our headcount 13 months from now,' unless you are much better at your job than I am, you probably don't have it on hand. Instead of going to you, they can just go here and you can focus on your day to day. Then finally, the big one that we're still really early on, but figuring out new and unique ways to leverage, is analysis. Obviously, you've heard of all of these wrappers that allow you to make slides, that allow you to make different decks. Well, we've tried to move all of that into just one window. Now we do a lot of our presentations directly in a ChatGPT window, where we feed in exports, we feed in information. We can have live Q and As and conversations through this chat window and it will spit out outputs. It will spit out our analysis and our conclusions. So, these are the three ways we think we'll be leveraging AI and our tool with Anaplan in the short term. But ultimately, we wouldn't be able to do this without the level of granularity and consistency that Polaris and the PwC team have enabled for us. 

 

Bill Howell 0:09:44.2: 

Hank, anything to add to that? 

 

Hank Woo 0:09:46.5: 

We'll touch on that. Jeremy covered some of the OpenAI use cases, but what PwC is developing, in collaboration with OpenAI, is using OpenAI's AI capabilities to actually accelerate some of the SDLC steps, right - using AI capabilities to, say, accelerate building and documenting functional design documents, doing data migration, or coming up with test scripts and whatnot. So, those are the use cases that we are also incorporating into our future engagements. 

 

Bill Howell 0:10:25.4: 

I am certain there are going to be a ton of questions about all of that at the end from the audience. That's great. Let me start by asking a few questions about the business outcomes you were targeting when you started this journey. Obviously, OpenAI has been a rocket ship of a company to be a part of. But there had to be some drivers prompting you to use workforce planning as the first set of use cases to go after. I'm curious what business outcomes or challenges you were facing, and what outcomes you were looking for, when you kicked off the journey at OpenAI. 

 

Jeremy Stern 0:10:55.2: 

Yes, great question. The classical answer is, again everything was in Google Sheets, everything was in Excel and the real issue was that information didn't travel as quickly as it needed to. Excel and Google Sheets and our slide decks - and even our BI tools and dashboards - really weren't cutting it and delivering the information that we needed in the time that we needed it to. So, instead of going with the classic first implementation, where you start with depth, you go very deep with one team, we thought in order for people to get used to this tool we want to spread it broadly to the organization. The tallest nail in the stump ended up being workforce planning. We were having a lot of issues with keeping up with not only what roles we were releasing, but when we were releasing them. Were levels being changed? Who had what gaps and who was way over-hiring? 

 

Jeremy Stern 0:11:52.7: 

Really, everything that jumps off of workforce planning. Besides compute, people are our second-largest cost, so we really had to get into the details there. So, we built out this tool in a few weeks, and what it enabled us to do was onboard a good chunk of the organization - again, as I mentioned, the budget owners and hiring managers, executives, recruiters and HR - to get used to the tool and answer their questions in a pretty low-risk environment, and also give people visibility into the quick wins they could gain with the tool. So, our choice of where to go and what to do first was pretty easy; the harder questions come later. 

 

Bill Howell 0:12:42.1: 

Yes, what's next, for sure. Well, your implementations have been blazing fast. I mean, the eight-week and six-week implementations - sixteen weeks total for all the implementation work you guys have done - are fairly incredible. I'm curious: what did you have to do upfront to prepare for such rapid implementations, so that you could execute efficiently? 

 

Jeremy Stern 0:13:04.4: 

Yes, absolutely. I know Hank can probably either affirm or deny these things, if they helped or not, but what we tried to do was first… The biggest thing is get our data in order. Data is usually the biggest blocker that we see in getting a successful implementation done on time. So, we had all of our datasets as clean as we could make them and all of our entire integration strategy set before we got into the tool. Obviously, as the team can attest, there were some tweaks - as any project goes - but getting 80 per cent of the way there saved us probably a month, a month-and-a-half at the start. So, data was first and then we also took a first pass at writing out exactly what we wanted and trying to describe every portion of the process that we wanted to complete.  

 

Jeremy Stern 0:13:54.2: 

This may sound, for the veterans in the room, like a no-brainer but for those who have never done an implementation, you just… You know what you want, but you haven't really communicated it before. You've never lifted the rug. So, we went about writing out every piece of our operating model and what we wanted out of it, what we wanted changed and what we wanted to keep the same. Then we litigated that a few times and then passed it over to the professionals, who then refined it for us. 

 

Bill Howell 0:14:23.3: 

It sounds like professionals are doing the work upfront. That's pretty impressive work upfront to get prepared. Hank, walking into that project, already having a running start toward it, maybe you can talk about what PwC brought to the party and a little bit about what you guys did to extend on that. 

 

Hank Woo 0:14:39.2: 

Yes, so I love to use a car analogy: what we brought was a Ferrari that could go 200 miles an hour. Clearly, there has to be a driver with skill. Everything that Jeremy said is true, and clearly, with the eight weeks, you have a reaction once you see it, right? For those of you who've been around the block implementing Anaplan, it's a really tight timeline. But nothing is impossible if your people are ready to use the tools and ready to engage - and OpenAI team members are so enthusiastic. Then Jeremy and Tim had the data, which is really the core foundation of building any models. They had the data ironed out. Jeremy had also started building some data hubs as well. Then, most importantly, we had Jeremy, right? Jeremy is a very skilled Anaplan model builder and solution architect, as well as part of the FP&A organization. So, Jeremy was really integral in terms of guiding us, right - building, but also engaging the business and driving the key design decisions and what have you. 

 

Bill Howell 0:15:48.0: 

It makes perfect sense. It sounds like a lot of things went right in the initial implementation - which is great. I'm certain, though, along the way there were challenges and you learned a lot from those challenges when you went into the next project for P and L. So, curious about: Jeremy, what challenges did you guys experience in your first implementation that informed the second, and then going into the P and L implementation, maybe talk a little bit about how those learnings impacted it. 

 

Jeremy Stern 0:16:12.3: 

Yes, well, I'll give PwC an out because they were not part of the mistakes in the first implementation, so no mistakes on that. No, I'm kidding, but through the first implementation we realized - even though we knew it intuitively - the importance of marketing the tool, marketing the intent and marketing the benefits. Throughout the second implementation - which, while slightly longer, gave us no more time to get people enabled and wait for them to adopt the tool - we were on a very tight timeline and had very crisp deliverables that we needed to meet by certain dates. We achieved that, and really, we marketed the tool, with PwC's help, by being upfront at the start. We identified the people that would be using the tool immediately. We got them involved as early as requirements gathering and initial wireframing. 

 

Jeremy Stern 0:17:09.4: 

If you provide a sense of ownership from the start, then people will feel that ownership throughout the implementation. I know everybody talks about making sure you iterate under the agile methodology. But rarely do you see it in practice as aggressively as we did, where people were getting sick of us by the end of the tool and saying, 'Okay, we get it. We know how it's built. We know what I need to do. Just let me know when it's done.' We didn't want that to happen, so marketing was probably the biggest learning project over project. Then beyond that, getting our understanding of: what is a K-Bug, what's an enhancement, what is a small, quick project and what will be a larger project? We looked at everything on the same roadmap and on the same timeline. PwC really supported us in identifying, okay, how would we group the big, chunky use cases - as I mentioned earlier, the revenue planning, the compute planning, the in-depth gross margin - versus some of the quick hits. 

 

Jeremy Stern 0:18:17.6: 

You may've seen on that slide: recruiting capacity has been a request since we've finished workforce planning. We said, 'Oh, we'll get to that with the big project, we'll get to that with the big project.' PwC helped us realize, okay, we can do those in tandem and get some quick wins while leading up to the next large project. 

 

Bill Howell 0:18:35.1: 

That's great. That's fantastic actually and you were doing all this in a new product from Anaplan; Polaris. Maybe talk a little bit about that aspect of the journey, Polaris being a different technology than what you may've used in the past in previous positions. Talk about how you stood up Polaris so quickly, what benefit that's brought to your business specifically, a little bit about that journey. 

 

Jeremy Stern 0:19:01.1: 

Yes, so we knew we wanted Polaris before we started this project. As I alluded to before, the very large, granular cube was very enticing. Really, that's because a majority of your time working in a pure FP&A or strategic finance role is spent hunting through data to find the good insights. Normally, with big cubes, that becomes much more difficult. But since we were able to leverage our models, and since we were able to leverage some of the structures and dashboarding that PwC supported us in making, that wasn't as big of an issue. Now I can confidently say one of the most popular dashboards - if not the most popular - we have is, I think, a cube with, what is it, eight dimensions, [?Ritchie]? Something like that - eight or nine dimensions, a few trillion cells. People are on there every day, multiple times a day. So, with Polaris, after we realized that we could have a much denser model, we don't have to worry as much about space, and we can implement quicker because we don't have to build all these custom hierarchies or custom rules. Then finally, we're able to derive insights much more quickly, both using our tools and using the views that we've built. It became very easy. 

 

Bill Howell 0:20:22.2: 

Yes, that's fantastic. Hank, anything to add from a Polaris perspective? 

 

Hank Woo 0:20:25.2: 

Yes, everything Jeremy said is true about Polaris. What Polaris does for us is - if you think about it, traditionally, EPM solutions always limit you on dimensions. So, you find creative ways to leave some dimensions out or combine dimensions to overcome some of the sparsity, right? What Polaris does is, it gives you… If you look at some of the marketing materials, it gives you - what - 18 quintillion cells per line item. I had to look it up: it's 18 zeroes after 18, so it gives you a lot of headroom, right? What that does is, it gives you the ability to consume the business data as it is, rather than having to do something about it, right? So, that's really the huge advantage of Polaris. I'm so glad that Jeremy actually did the homework, spent the time and concluded that Polaris was the right solution. We wasted no time going right into Polaris. For potential customers that don't have Polaris, that haven't implemented Anaplan, it's probably a no-brainer, right? Go right into Polaris. Then the question for existing users is not if, but more of a when, right? When is the right time to migrate from your classic models to Polaris? 

 

Jeremy Stern 0:21:47.4: 

I do just want to add one more thing. We definitely had our doubts and our reservations when first looking at Polaris. I know we talked a lot about the pros and cons. It's really a testament to the R and D that Anaplan has put into the tool. Polaris is not the same tool it was six months ago, let alone a year ago. This idea that we had of, oh, there are a lot of limitations, all these things don't work, it's going to take forever to load and close, we really haven't experienced any of those. So, yes, while it was easier starting fresh with a brand-new model, I would highly, highly recommend if you want that - again, that big cube - to start looking at transitioning. 

 

Hank Woo 0:22:28.4: 

Yes, no more concatenations. 

 

Jeremy Stern 0:22:30.2: 

Yes. 

 

Bill Howell 0:22:32.3: 

Good advice. Let's talk a little bit about results and how you're measuring those results on the business side. I'm curious about how these initiatives have changed not only the financial outcomes for OpenAI, but also the quality of life for the people at OpenAI and finance. Then Hank, on the PwC side, how are you helping OpenAI measure that? I'd be curious about those answers, so maybe… 

 

Jeremy Stern 0:22:54.0: 

Yes, I guess I can start with the anecdotal and you can start with the quantitative. So, like any finance team, right, we were not special in the issues that we had. We were having to pull down the same datasets every week, every month, every day. Sometimes we were having to organize it, chop it, pivot it and then push it through our models. It takes a lot of coordination and it also gives a lot of opportunity for human error. What the tool and the new application that we've built allow us to do is not really think about integrations. Yes, we always have to check and we always want to make sure that everything that's coming in is right and complete. But now that we don't have to worry about organizing or pulling the data down, we have much more time to actually do the modeling. 

 

Jeremy Stern 0:23:50.7: 

But then we also automated the modeling, and we set certain preferences, set certain drivers. Again, Polaris has allowed us to be a little more liberal in how we structure that. We set our drivers, we backtest, we make sure that all of our models are accurate. Then we do what is really the core of the job, which is insights and reporting out - speaking with people and trying to figure out: how do we properly allocate our capital? So, just to go through some high-level numbers, I was chatting with a few of my teammates to figure out: how has it affected you? Obviously, I spend every day in Anaplan because we're trying to build quick improvements. I'm also trying to show my team the things that they may not want to do in there yet. But one anecdote we got from the one guy who runs our entire payroll and equity forecast is that it saved him, he said, probably four-to-five business days. 

 

Jeremy Stern 0:24:47.5: 

Not only did he have to pull down all of our payroll and equity data, but he had to have an argument with our payroll and equity people every time he did it - just to make sure he knows that it's PII. He now has that fully integrated. It runs once a month. He just grabs it, checks it, makes sure it looks good, adjusts some drivers based on our back tests. Then he's done within a few hours instead of a few days. We've seen that very similarly across multiple functions; through payroll, through core, through vendor planning and through CapEx. 

 

Bill Howell 0:25:23.4: 

Nice, great outcomes. 

 

Hank Woo 0:25:27.3: 

So, those startup-type anecdotes, right - real experience enhancements and testimony - are important. But with Jeremy, Tim and Peter, we also defined what success looks like, because success sometimes is subjective. If you don't define the goalposts, then you don't know where to kick your ball, right? What we did - even though it was a quick turn - was spend some time defining, from a speed, quality and cost perspective, what success looks like: quantitative, measurable success, but also qualitative. Then being able to follow up and anchor back to that was a good structure to ensure we were driving that impact and value. This is where Anaplan Value Assurance plays a big role as well, right? Liaising tightly with the SI as well as the customer teams, so that there was a three-legged stool all working together. 

 

Bill Howell 0:26:25.4: 

I tell you what, it's been a really great collaboration - not only preparing for this - but just seeing the work that you guys have done together. The roadmap that you've built out is super-impressive. I think PwC, you've helped inform that but Jeremy, you and your team have really identified: what are the next business outcomes we need to go target? Curious about maybe sharing a little bit about what's driving the roadmap for you. We had it on screen a little earlier. What are the next steps with Anaplan? 

 

Jeremy Stern 0:26:49.5: 

That's a good question. As we mentioned earlier, PwC was very helpful in figuring out, again, what's a big project, what's a quick project and what's just a fix. But filling out the roadmap took a little while - it took sourcing information and the biggest pain points from our teams, and really, it ended up being: what's burned you most recently? We had to make some very quick decisions on whether we could deploy capital in certain areas. Over the course of about a month, we filled out probably a multi-year roadmap based on what's burned us recently. First, again, we filled it out, and then we figured out: what is the order? We were talking about this a little bit earlier, but we are still a very nascent company, with very nascent processes. We kind of have to wait until the models and the structure are ready.  

 

Jeremy Stern 0:27:45.4: 

We go out to each team and we say, 'What would you like automated? What would you like at this granular level that Anaplan provides, and what would you like integrated?' Everybody has a great wish list. Then we ask the important question, 'Okay, how often does your model change?' Almost everyone says, 'Monthly.' So, that's probably not the right time to get something into the system. We realized we had to figure out first, what is the change management that's going to be required? What is the analysis that's going to be required in order to get a Google Sheets model to a steady-enough state where we can then build it into a system? Now we had to add that layer of the roadmap on top as, first let's explore what you're doing in Google Sheets and see if we can't pull it up a few thousand feet just to get at least a directional level of consistency.  

 

Jeremy Stern 0:28:34.9: 

Obviously, some changes are inevitable and we are investing in having the ability to implement changes and implement structural changes. But ultimately, we need to see at least two consistent months of the same type of model in order to build it into the tool, so that's how we ordered our roadmap. 

 

Bill Howell 0:28:57.2: 

Yes, that's great, and it seems like it's methodical and deliberate and planned - which is fantastic for you guys. Hank, anything to add on the planning side for PwC? 

 

Hank Woo 0:29:06.2: 

Yes, something that comes to my mind is that it's a really connected roadmap that is highly agile. Think about Stargate - which Jeremy mentioned - building out infrastructure in house introduces a whole new set of challenges: building up model processes and infrastructure around supply chain. Then clearly, given the rate of growth that OpenAI has been seeing, there's going to be a point in time when finance will have to upgrade the whole infrastructure. As that relates to planning solutions, it's about being open and agile in building out the roadmap, to make sure the roadmap stays current. Then also looking ahead: what's coming around the corner? Jeremy mentioned it's a nascent company, but there are some extremely talented and smart people at OpenAI - including Jeremy. It's such a pleasure, right? Learning and developing and defining the future as we go. 

 

Bill Howell 0:30:06.5: 

Yes, it's a fun story. It's been fun to be a part of and I really appreciate you guys taking the time. One final question for my side and then we'll turn it over to the audience. You've mentioned people readiness and data readiness and change management and all the elements of; how do you activate this in a business? But what are some final words you might have for folks in the audience around the lessons you've learned and what they should know if they're going on this odyssey themselves? 

 

Jeremy Stern 0:30:30.2: 

Yes, I think the biggest learning that we had - and you see it pretty consistently throughout every project I've been a part of, in my past life and at every company that I've built out these types of tools for - is: bring as many people along for the ride as possible. The earlier you introduce people to this tool, the easier it will be to have them use it consistently. It's obviously a hard conversation to have at the start, to say, 'Hey, this thing isn't ready yet, but can you come take a look? It's very bare bones.' But the earlier you bring people in, the easier it will be to explain - and the easier it is for them to evangelize. 

 

Bill Howell 0:31:09.1: 

That's great. Good advice. 

 

Hank Woo 0:31:12.1: 

I'd like to add to that: have a vision, dream your dreams and then find the right partner. 

 

Bill Howell 0:31:22.4: 

That's awesome. Well, thank you gentlemen. Jeremy, Hank, thank you very much for being here. We appreciate the time, thank you. 

 

Jeremy Stern 0:31:28.3:  

Thank you. 

 

Hank Woo 0:31:27.8: 

Thank you. 

SPEAKERS

Jeremy Stern, Finance Manager, FP&A & Systems, OpenAI

Hank Woo, Partner, Finance Technology & Transformation, PwC

Bill Howell, AVP, Anaplan