


PegaWorld iNspire 2024: Pega Knowledge Buddy: AI-powered Answers for Everyone

What if employees could get the concise answers they need directly within the context of their work? And customers could quickly find trusted answers in self-service? What if nobody had to search through long or disconnected documentation to find an answer?

Join this session to learn how Pega GenAI Knowledge Buddy empowers employees and customers with AI-powered answers - reducing handle time and improving consistency in customer service, accelerating productivity and improving employee efficiency everywhere, and improving everyone's experience when seeking an answer.


Transcript:


- So before we get too far into it, I wanna sort of set us up with some history and context here in the interest of making sure that we're all on the same page, we're working from the same set of assumptions. We know the buddies that led us to this moment 'cause you know, buddies have been around for a long, long time. Some of you may not know this, but of course, in the 20th century there's Buddy Guy, Buddy Miles, Buddy Rich, very influential Buddies on 20th century music. And of course the buddy cop comedy, "The Men in Black," "Starsky and Hutch," "The Other Guys," many, many more. If any of you doubt that "Men in Black" is a buddy cop film, we will be taking questions so I'll be sure to clear that up if it comes up. And it would be remiss not to mention Buddy the Elf, everyone's favorite Buddy until now. So you saw Karim showed us this man, this poor, poor man earlier today at the keynote. I feel personally validated by this man. I feel very seen by him. I'm sure many of you too. Really, Knowledge Buddy started with this basic question, which is what if you could just ask your documents, right? I mean, I work at Pega. I started up recently and onboarding was, you know, it can be a lot to take in. You're going through all kinds of HR policies and you know where emails live and who to talk to and what kind of information you need to bring in. But also imagine if you're a customer service rep or a seller and you're talking to customers, talking to prospects. And while you're doing that, you have to rifle through thousands of pages of PDFs or policy documents or product documentation or SOPs, RACI charts, all kinds of things that, you know, you have to just have in your head. But you know, you might not have them in your head at any given moment. And of course, it's a lot of stuff out there in these enterprise organizations. It's too much to ask some person to just be an encyclopedia on the dot, right? 
So we decided to find a more elegant solution to this problem with the power of large language models, of GenAI and of RAG, to make it easy to bring those answers right to everyone's fingertips, right at a moment's notice, so that you can just ask this interface, "Hey, what is this policy?" "What do I need to know about this procedure?" "Does this thing apply here?" Get that answer in plain language right away in the way that makes the most sense for, you know, the context that you're in, right? So I just mentioned RAG. It occurs to me that maybe not everybody here is fully aware of what RAG stands for. So I'm issuing you all a pop quiz. We will give you a multiple choice selection so please don't blurt out the answers until they're all displayed on the screen. But is RAG Red Amber Green? Those are colors. Is it Retrieval Augmented Generation? Which sounds complicated. Is it Red Auerbach's ciGar? Celtics in four. Sorry, maybe some Texans here, sorry about that. Or is a RAG just a humble dish towel? Of course, it's Retrieval Augmented Generation, right? And this is the process that we use to optimize the output of a large language model so that it looks at your enterprise information, it looks at the documents that you want it to look at, and it surfaces answers that are relevant for your enterprise instead of using, you know, the training data that a lot of LLMs use, which is just like the whole internet, right? But your enterprise isn't necessarily part of that. So RAG allows you to refine that and use semantic search and chunking to look at the right pieces of information while still retaining that magic of the LLM, which allows you to actually talk to the interface and get simple answers in simple language. And I thought a good way to demonstrate this idea would be to ask a couple of the LLMs that are out there, what is Knowledge Buddy? What is a Knowledge Buddy after all? And so I started out by asking ChatGPT, which we all know and love, and it gave me this answer.
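The RAG flow described here — retrieve the relevant chunks from your own content, then have the model answer only from them — can be sketched in a few lines. This is a toy illustration: a bag-of-words cosine similarity stands in for real vector embeddings, and the prompt wording is made up, not Knowledge Buddy's actual implementation.

```python
# Toy RAG sketch: "embed" chunks, retrieve the best match for a question,
# and build a prompt grounded in that retrieved enterprise content.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in embedding: a word-count vector (real RAG uses model embeddings).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str, chunks: list[str], k: int = 1) -> list[str]:
    q = embed(question)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

def build_prompt(question: str, context: list[str]) -> str:
    # Instruct the LLM to answer ONLY from the retrieved chunks.
    return ("Answer using only the context below. If the answer is not in "
            "the context, say 'I don't know.'\n\nContext:\n"
            + "\n".join(context) + f"\n\nQuestion: {question}")

chunks = [
    "Employees accrue 20 vacation days per year, prorated from the start date.",
    "Expense reports must be filed within 30 days of the purchase date.",
]
top = retrieve("How many vacation days do employees get?", chunks)
prompt = build_prompt("How many vacation days do employees get?", top)
```

The key property is the one Sean calls out: the model's answer is constrained to your documents rather than the open internet the base model was trained on.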
"Knowledge Buddy typically refers to a concept, individual groups pair up, share, exchange knowledge, skills and expertise with each other. It's a mutual learning relationship where each person acts as a mentor or guide," et cetera, et cetera, et cetera. Which is a great answer. It's also not the answer that I'm looking for, even remotely, and why would it be? Because ChatGPT wasn't trained on Pega's, you know, most recent GenAI product releases, so why would it know what I'm asking in this context? I also jumped over to Microsoft Copilot and asked, "What is Knowledge Buddy?" And it told me that "it's a helpful companion that provides expert insights, tutorials and resources to enhance your learning journey." That sounds, you know, decently close. But then we get into, you know, visa questions and immigration and scholarships and all sorts of things that are, again, not related to what I think about when I'm thinking about Knowledge Buddy in the context of Pega. All right, so this is kind of where we're going here. So we, actually, at Pega use Knowledge Buddy internally for people like me and Shelby and others to, you know, learn more about our product and get the latest and greatest. So I went over to our WebEx channel where I can ask our Knowledge Hub Buddy various questions about Pega. So I asked it, "What is Knowledge Buddy?" And it gave me this answer, "a client-facing product within Pega GenAI, helpful GenAI-powered chatbot with a base knowledge about Pega," and so on and so forth. And I thought, well, this is actually quite a good answer, but I want it to be a little shorter. And so I decided to go over to another Knowledge Buddy that we use internally. I didn't have to do that, but I figured, you know, since I'm presenting I might as well get the full effect here. So I brought up our Sales Buddy, which we use in our sales automation platform, or, excuse me, industry-leading sales automation platform.
And I said, "What is Knowledge Buddy?" But this time I specified, "Please answer in one sentence." So it gave me the sentence, "Knowledge Buddy is an AI-powered conversational assistant that provides precise answers to specific questions using an organization's own content, enhancing the relevance of responses and eliminating the need for users to search through potential document matches." That's great. You'll also notice that it pulled up three reference links. And this is why, this is kind of how RAG is working in the background here. It looked at these articles and said, "These are the articles that probably answer the question. I'm gonna pull out a summary sentence of these articles based on what you asked and the way that you asked it. And if you need to know more, you can click on any of these links and you'll know more, you can share these articles, you can affirm what I just told you and make sure everything is all squared away." But one thing you might be thinking now is that I've shown you two Knowledge Buddies here. Are they multiplying? How many are there? Is this some Hydra, you chop off one head, three more pop up? We figured that it's probably not practical to try to pull up one Knowledge Buddy that is all things for all audiences, right? Because you may have a Knowledge Buddy surfaced in self-service that's talking to customers, using certain kinds of language, looking at certain documents. You may also have a Knowledge Buddy that is looking at really sensitive legal policies that you're using internally for a small audience. So it might not make sense for those all to be wrapped into one single buddy. So what we've decided to do is make it really, really easy to stand up multiple buddies that are fit for purpose. And again, Pega is a low code platform, right? We believe in reuse center out, and we want this to be as easy as possible to get as wide adoption as possible, as efficiently as possible.
So it's very simple and straightforward to do this. Shelby and Steven will show you in a little bit just how simple it is to stand up buddies that are fit for purpose like this. And really it comes back to looking at the right use cases, the right audiences. This is, you know, a sample of the audiences that we think about at Pega when we think about Knowledge Buddy. Of course, this is open to your interpretation and creativity for the things that your enterprise needs. But imagine self-service, where customers are coming to your website and they're interacting with Knowledge Buddy through chat to answer questions in a really intuitive, logical way instead of like press one for yes and two for no and three for returns or something like that. People don't want to do that. They've been using ChatGPT this whole time. They just want to be able to have a conversation through self-service, right? So that's made possible through Knowledge Buddy. I talked about agents in customer service and sellers as well. Being able to pull up information while you're speaking with prospects and customers on the fly, have the right thing at your fingertips, the most recent information at your fingertips really quickly. And of course, operations, which is almost everything else. Onboarding, sure. I mentioned Legal Buddy, and I could imagine there being an IT Knowledge Buddy, which would really be very, very simple because all it would have to do is ask, "Did you try turning it off and turning it back on again?" Okay, so we're gonna get over to Shelby and Steven in just a second here. But I want to just, you know, position Knowledge Buddy as part of this broad suite of GenAI products that you guys have heard about this morning at the keynotes, at the breakout sessions, right? Knowledge Buddy is an awesome AI tool.
It's part of a broad set of applications that we make available and that do all kinds of things, from scaling expertise, where you're able to guide employees as they're making decisions within a case about what the best next action is and things like this, to analysis of your case data and pulling out insights that are intuitive and easy to interact with. Or of course, automating things like writing emails or pulling up, you know, notes after calls and stuff like this, right? So making every employee your best employee is the way that we position GenAI, and that's really what Knowledge Buddy's trying to do in concert with the other capabilities that we have. So a proper introduction here for Steven, like I said, VP of North America Sales and Service with Aaseya, platinum sponsor, they're at Booth 21 in the Innovation Hub. They're there to be, you know, a resource for all of you. So please go visit them after the session when you guys have a chance. But part of what makes them 2024 Partner of the Year is over 150 go-lives all over the world, 600 low code projects, and more than 600 low code delivery professionals delivering projects all over the world. And they've integrated Knowledge Buddy as an early adopter. And when I said those use cases that you saw on screen aren't everything, they found some creative and interesting ways to use Knowledge Buddy to better their product and help their customers even more. So a PegaWorld welcome to Steve. Thanks for coming with us.

- Thanks, Sean. All right, good afternoon, everybody. There are quite a few people here. I wasn't quite sure how many would show up considering we're on the third floor, and after lunch, but thank you for the intro there. So before we get started with our Knowledge Buddy integration with our solution around AIS, our Aaseya Inspection Solution, I wanna show a very brief video for everyone.

- [Presenter] Welcome to Aaseya Inspection Solution, AIS. A comprehensive cross-industry inspection platform designed to adapt quickly to changing regulations, standards and quality requirements. AIS leverages a reusable framework across various lines of business and inspection categories such as manufacturing, healthcare, hospitals, buildings, roads, environment and hospitality. AIS is a configurable solution that manages the entire inspection case management lifecycle with no code. AIS also incorporates modern features like Pegasystems' Knowledge Buddy integration and Pegasystems' AI-driven capabilities to enhance inspector productivity, streamline the inspection process, enhance decision-making, and improve compliance and risk management. With its adaptable, AI-enhanced features, AIS is not just a solution, but a revolution in the inspection process, bringing unparalleled efficiency and reliability to businesses of all sizes.

- So in terms of our inspection solution, relative to this conversation today, we want to provide a tangible example of how to integrate and use Knowledge Buddy in a way to be able to surface a lot of the hidden data that's buried in various documents. And considering the nature of the inspections world around compliance, standard operating procedures and such, we've developed a solution that helps you accelerate and bring that to market very quickly. And so when we talk about no-code inspection configuration to accelerate that through our solution, you can go through and set up these inspection templates and categories and checklists, you can establish and configure the scoring methods around how you wanna score, what passes and fails, be able to monitor the whole lifecycle of what's happening, configuring workflows, as well as configuring work zones based on the different types of inspections and inspectors in the region, configuring templates as well as roles and security. And so over the past, I would say, month and a half or so, we've been working with Pega to be able to take Knowledge Buddy to be able to manage that whole lifecycle. And so what we're bringing to the table today, which I'll demo for you guys, is that based on our inspection solution, we can configure these pieces. And in terms of speed to market, what really helps accelerate that even further is the unique use case that we brought forward as part of the demonstration today. And so when we look at these two different use cases, one we're calling Configuration Buddy, where we're basically using regulatory documents and SOPs, ingesting these knowledge content documents to be able to use that information and surface it to automatically configure our inspection solution. So reducing the necessity for people to manually go in and configure it, we can pull that content in and be able to surface that to accelerate that configuration process.
The second use case is the more traditional one that you've heard so far, is where Knowledge Buddy is more of a co-pilot, if you will, where the inspectors can actually ask questions in context of what they're going through. And so we'll demonstrate that as well. So I needed to get out of this.

- [Sean] Yeah, you seem like you...

- Okay. And so what you're seeing here is our Inspection Solution, and as you can see we have many domains of configuration of our solution to manage a whole lifecycle. But for today's conversation, I'm gonna show you what we've done with Knowledge Buddy. And should you guys have more questions, feel free to stop by our booth today and we'll do a detailed demo for you. And so when it comes to managing inspections, it starts off with getting that inspection configuration set up, and so we've incorporated a new feature called AI Configuration Buddy, in which you can start that process by uploading that content document. And so I'm just gonna give an example here. Fire Safety, oops, upload a document, SOP, attach it. Okay, typing in front of a bunch of people is not very fun. Let's see, fire safety, that's for safety. And so before I click Submit, I'm gonna explain what's gonna happen. So what we've done is, through the Inspection application, we're consuming and attaching these documents as the basis of what the inspection should be. And so what happens is that once we click Submit, we're integrating with Knowledge Buddy to be able to pass that content. And then we've pre-configured some of the things that we're looking for to be able to return back those checklist categories and criteria. And so what you'll see here is that you'll see the results of that. And so you'll see that the General Safety Measures category includes the following checklist items. And so when we talk about configuration, this is the first opportunity in which, when we integrate with Knowledge Buddy, we bring that forward to the person who's configuring it to review it and make sure everything's okay. And so they can go into each of these items here, and they can either edit them, make some changes, they can move an item to a different category if that's a wrong one, they can make a copy of it in another or just remove it completely. So we're giving the end user the flexibility to do so.
And so what we can do here is, once the inspection configuration person sets this up, they'll click Publish. And once that's published, it's now available to be able to initiate new cases and run inspections. And so from a Configuration Buddy perspective, that's, you know, how we're using Knowledge Buddy differently than a traditional Co-Pilot use case. So I'm going to come out, log out of this user for a moment. And the second use case is the Co-Pilot, oh, sorry, it's going to go from the beginning. So the Inspection Buddy use case, which is the Co-Pilot, is where once the case is initiated and they're walking through the case lifecycle, Knowledge Buddy is assisting the inspector throughout that lifecycle. And so what we'll do is demonstrate and log in as an inspector, and I've teed up a few cases here already. So I'm gonna open up an inspection case that's been created, and what you'll see is all those configurations that were set up as part of the configuration process brought forward and presented to help guide that inspector through that process. And so what you'll see down here in the bottom right-hand corner is the inquiry function around how to ask the Buddy a question relative to the inspections. And so when you have that knowledge content inside of the Buddy, you can ask various questions that support the inspection lifecycle. And so as they go through the inspection process, they'll click Yes or No on things like did it pass, did it fail? They can add notes to the process, and should there be a question about, you know, if something failed, what you'll see here is that we've integrated with Knowledge Buddy to be able to do a query to say, "Hey, this item failed, what is the associated corrective action?" And so provided that the knowledge content that's been loaded exists, you're gonna get that answer.
And so when we pose the question in a similar fashion through the Co-Pilot, I could ask, what are the, let's see, smoking restrictions corrective actions. So if I ask the Buddy the question, it comes back with the response saying "The corrective actions for smoking restrictions is to enforce smoking restrictions through campaigns and regular checks." And so when we look at this checklist item here for that item, if I click No, it should bring back the same answer as well, and it's gonna make a liar out of me. Sorry, demo. So this gives you some context as to how we're using Knowledge Buddy to be able to query these regulatory documents and surface assistance in the inspection process without having to know all the details around what this content contains. So those are the two primary use cases that we have to demonstrate for you today, and as we move forward in the presentation, we're gonna talk about the Buddy anyway.

- Yeah, yeah, let's start here. So obviously, the first use case you walked us through there is a little bit non-standard. You won't go to a booth today and see a Knowledge Buddy returning configuration results like you just saw. So you're returning some JSON based on those SOPs that you're uploading.

- Right.

- Can you tell us a little bit about how long it took you to set up one or both of these buddies and what you learned during that process?

- Yeah, so the actual Buddy setup process didn't take very long. I would say it took, you know, just a couple days to go through the setup process and to understand some of the mechanics of how Knowledge Buddy works. But once we started working through that and working with Pega, the Pega product team, they helped us refine what these queries are and to be able to make sure we shape these things properly, as well as configure, from a technical perspective, the chunking size and the, what was it, the match scoring, I think?

- Yep, that's right.

- Around what's the optimal thing to do. And so in terms of getting that up and running, that was relatively straightforward. Now the nice part of what you're seeing here is that these two Knowledge Buddies are two different ones, but the foundation for it is the same. The difference is how you're interacting with it. And so from a security and roles perspective, when you have a Knowledge Buddy set up, you can manage who can access that interactive portion of it. But from a system-to-system perspective, we can manage that as well. And so when we built the integration between our inspection solution and Knowledge Buddy, we're posing the same question against the same content, but returning it back in a JSON format where it's consumable. And so that's one of the learnings that we had to be able to, you know, configure those things. And quite frankly, it gives us an opportunity to even integrate further and automate more as well moving forward.
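The system-to-system pattern Steven describes — the same question posed against the same content, but returned as JSON the application can consume — might look roughly like this on the consuming side. The response shape and field names below are hypothetical, invented for illustration; the real contract would come from the integration's own schema.

```python
# Sketch: parse a (hypothetical) JSON checklist returned by a configuration
# buddy into objects the inspection application could use directly.
import json
from dataclasses import dataclass

@dataclass
class ChecklistItem:
    category: str
    criterion: str

def parse_buddy_response(raw: str) -> list[ChecklistItem]:
    # Assumed shape: {"categories": [{"name": ..., "items": [...]}, ...]}
    payload = json.loads(raw)
    return [
        ChecklistItem(category=cat["name"], criterion=item)
        for cat in payload["categories"]
        for item in cat["items"]
    ]

# Example JSON a configuration buddy might return from a fire-safety SOP:
raw = json.dumps({
    "categories": [
        {"name": "General Safety Measures",
         "items": ["Fire extinguishers inspected monthly",
                   "Exit routes clearly marked"]}
    ]
})
items = parse_buddy_response(raw)
```

The point of the pattern is that a structured answer can drive configuration automatically, while a conversational answer to the very same question serves the Co-Pilot use case.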

- And you had originally gone down the path of building that yourself, right?

- Yeah.

- You were looking at ChatGPT or just an LLM. What were some of the capabilities that Knowledge Buddy brought to the table that made you decide to do this with Knowledge Buddy and just not on your own?

- Yeah, I think the simple answer is there's a lot of heavy lifting that has to happen, right? In terms of going off doing your own thing, and considering, you know, we're a Pega partner and we embrace the new technology and such, you know, with all the functionality that is built into Knowledge Buddy already, there's no reason to go try to build it yourself. And there are so many features inside of it that I can't get into every detail of it, but when you can do auditing of every single question that was posed, see what the response was, see the basis of the response, those types of things are invaluable in terms of being able to have an inventory of what those things look like, not just for auditing purposes, but also for performance purposes as well as the accuracy over time.

- And we use that a lot when we were fine tuning your Knowledge Buddies, like you talked about before. So if we got a response that we didn't like, right, it was slightly off or maybe it was missing one of the bullets from the standard operating procedure, we could go open that ask case.

- Yeah.

- See the results of the semantic query, see the prompt that was used at that time and the team used that as a tool. So not only was it historical, but it was actually a tool to help refine the Knowledge Buddy and make it produce the answers that we wanted.

- Yeah.

- Yeah.

- So what's next? I know when I talk to clients, Knowledge Buddy is a great kind of first step into that GenAI world if you're just getting used to it. But obviously, we're spending a lot of time here this week, talking about all the GenAI capabilities that Pega brings to the table. What's next for you guys?

- So I think what's next for us is obviously, we have our Inspection Solution, which is brand new to bringing to market this year. And we're definitely going to expand into, you know, automating a lot more things. And as we look at potential working with potential customers, whether it's for inspections or non-inspections, we can help them really assist them to how to kind of curate the thinking around how do you wanna leverage the Buddy because right now it's pretty wide open and I think everyone to a certain degree has an idea of what can it do, but I think focusing on what you're trying to accomplish and to be able to look at the business value associated to it, you know, we're gonna obviously continue with that journey on the inspection side, again, to take these difficult regulatory documents and use that as a true source for either configuring our solution or configuring some other solutions that we help build within the Pega platform. So that's one huge area for us. And then also doing a lot more automation around that process. And so some of the things we demonstrated are a little more manual intensive, but again, depending on the use case of what you're trying to accomplish, we feel that we have the confidence in working with the Pega team around how we can actually automate that in a very efficient manner and reducing the manual interactions.

- Okay.

- And so that's one area. The other is on the chat side of things where we're surfacing the information and whether it's a call center trying to, from a self-service perspective, I think there's huge opportunities there as well, yeah.

- So it looks like we've got about three minutes and 30 seconds left. Does anyone have any questions for the Pega team or for Steven?

- And if you do, just step up, use one of the microphones.

- Yeah, we got some mics. And you know, introduce yourself and go ahead and ask away

- And before you ask the question, if it's very technical, just stop by the booth. I'm not the right person to give you the answer. So if you wanna sit down that's fine.

- [Abdul] My name is Abdul, I'm an IT principal director at Cigna. So if you see our industry, data and security are some of the critical things, like PHI and PII. So in this one, if anybody goes and asks Knowledge Buddy who was the inspector for this one, does your Knowledge Buddy have any filters to really restrict any sensitive information? How do you handle that?

- It's a great question.

- Is that me?

- Not sure who that is.

- Yeah.

- Yeah, there we go. Yeah, no, it's a great question. So there's a couple different layers to that as well, right? So you've got your Knowledge Buddy, which you're able to secure with RBAC controls. You can put security around that like you could any other feature in an application. And then you've got the underlying data, or what we call a data source. So if you've got standard operating procedures that your internal people should access, but not necessarily your external customers, you can secure those two data sources differently. And then you can also instruct the prompt. Something that people often overlook is that the application itself has built-in security, but you can also tell the prompt, "You are a Buddy that is meant to interface with my customers. Never give an answer about X, Y, and Z. If you're asked a question that you don't know the answer to, respond with, 'I don't know, call into this number.'" So it's a combination of your typical application-level security, both at the Buddy level and at the data source level. And then, you know, instructing the LLM to give you the responses that you deem appropriate for the user that's interacting with the Buddy.
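The layering described in this answer — role-based access to the Buddy and its data sources, plus a guardrail instruction in the prompt itself — can be sketched as follows. The role names, data source names, and guardrail wording are all invented for illustration; they are not Knowledge Buddy's actual configuration model.

```python
# Sketch of layered buddy security: (1) role-gated data sources,
# (2) a guardrail instruction prepended to every prompt.
GUARDRAIL = ("You are a customer-facing assistant. Never reveal internal "
             "policies. If you don't know the answer, reply: "
             "\"I don't know, please call support.\"")

# Which roles may query which data sources (hypothetical names).
DATA_SOURCE_ROLES = {
    "public_faq": {"customer", "agent"},
    "internal_sops": {"agent"},
}

def sources_for(role: str) -> list[str]:
    # Only data sources this role is allowed to query get searched.
    return [s for s, roles in DATA_SOURCE_ROLES.items() if role in roles]

def build_prompt(role: str, question: str) -> str:
    allowed = sources_for(role)
    return f"{GUARDRAIL}\nSearch only: {', '.join(allowed)}\nQ: {question}"
```

A customer never reaches `internal_sops` at the retrieval layer, and even a retrieval slip is backstopped by the guardrail instruction — two independent layers, which is the point being made above.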

- [Abdul] So does the Buddy need to know the role, like who's asking this question at all?

- So absolutely. So if it's an external-facing Buddy, Knowledge Buddy ships with a public user role out of the box. So that's what you would typically use for, like, your customers. And then there's also, you know, authors, data source managers, there's agents. So the typical kind of role-based security you would see in Pega applies to Buddy as well.

- [Abdul] That's exactly it. Thank you.

- Yeah. Any other questions? Looks like we got one.

- [Participant] So in this case, what measures are you taking in setting up the vector database or fine-tuning this?

- Yeah.

- Do you have any experience on that yet?

- Yeah, a lot of fine-tuning points, and Steven and I went through a lot of this as we implemented these Buddies. First, it's getting comfortable and getting educated about chunking as you bring data into the system. So I think the chunk size that we ultimately landed on for these documents was like 1086, and we found that that was the most optimal to get the data kind of aggregated together in a way that made sense and gave us the responses we liked. Beyond that, prompt engineering is just becoming such a critical skill for anyone right now. So the fact that we are isolating the data that the LLM has access to answer from is step one to answer your question, right? It's only answering from that corpus of data that we're providing it at question time. And then step two is getting really good at instructing the LLM so that when it doesn't know the answer, it doesn't hallucinate, it doesn't make things up, but it instead gives you a response like, "Call our customer service agent" or "I don't know," which is just as powerful as all the answers you're getting, by the way. "I don't know" in this sense is like gold to us because it does two things. It prevents, you know, misinformation or that hallucination that you're talking about. And also, we're gonna create a case in the Knowledge Buddy application that says, "We didn't know the answer to this question." That means either Knowledge Buddy did its job and filtered that out, or it means we have a gap in our content that we need to go fill and actually add to the vector database so that next time it does know. So it's a combination of providing Knowledge Buddy with the right data and getting really good at prompt engineering. It's a skill that, you know, is becoming more and more important.
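The chunking step discussed here can be sketched as a simple fixed-size splitter with overlap. The ~1086-character size they landed on was tuned for their documents; treat it as a knob to experiment with, not a universal default, and note that real pipelines often split on sentence or section boundaries rather than raw character counts.

```python
# Sketch: fixed-size chunking with overlap, so context that straddles a
# boundary still appears intact in at least one chunk.
def chunk_text(text: str, size: int = 1086, overlap: int = 100) -> list[str]:
    assert 0 <= overlap < size
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap  # step forward, keeping `overlap` chars shared
    return chunks

doc = "x" * 3000
pieces = chunk_text(doc)
```

Each chunk is then embedded and stored in the vector database; the retrieval quality you get back is very sensitive to this size/overlap choice, which is why it took iteration to land on a value.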

- Yeah, I would add one more thing to that. I mean, beyond the technical aspects of it, I think you really need to take into consideration the scope of what your Buddy is. And so as I said earlier in the presentation, the Buddy can't be everything to everyone, right? And so there's a mindfulness, I think, in terms of what you're trying to accomplish, and set up different buddies based on different situations.

- That's right.

- And so-

- [Participant] Another key challenge we have is we have huge documents sitting in our bank, and as the gentleman talked about the security measures, right? We don't want to share our PII data with the outside world, with the LLM, because those are restricted PII, confidential PII.

- Sure.

- [Participant] So how does the data get shared?

- Yeah.

- What are the regulations and compliance measures that we need to take care of when we deal with Knowledge Buddy?

- Yeah.

- That's one of the stumbling blocks for us to move forward. We're kind of doing a POC to see what the limitations are and what the alternative solutions for those limitations are.

- Sure, in most cases, Knowledge Buddy isn't looking at PII, right? It's looking at your internal documents. That's not to say that there aren't use cases that do use PII. So today, in the current version of Knowledge Buddy, we're sending that out to OpenAI on Azure, and you have a couple different levels of security there. One is you can mask PII when it's going out in transit, so you can strip that out, it doesn't get sent over the wire at all, and then when it comes back, we reassemble it based on that kind of masking. The other part I'd like to point out is, you know, just in theory, RAG does not constitute the training of an LLM based on the data you're giving it. So that is very transactional. It's going out to OpenAI, OpenAI is giving us the summary and the response, but there's no record of any of the data that's been passed out in Azure at all. It's just transitory. It comes back and it only lives within the Pega ecosystem. So it's a combination of stripping it out, and, I think you saw the announcement this morning, we're opening up support for other LLMs. I think there's a strong possibility at some point for something like a BYO or integrations to maybe even a Pega-supported LLM. So I think all of that, those are great questions for the product team if you stop by their booth in terms of, you know, what the future looks like, but there is plenty of opportunity today for you to keep that data secured.
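The mask-in-transit idea described here — strip PII before the prompt leaves your environment, then reassemble it in the response — can be sketched like this. The regex and token format are purely illustrative, not how Knowledge Buddy actually implements masking, and a real masker would cover many more PII patterns than one SSN format.

```python
# Sketch: replace sensitive values with placeholder tokens before sending
# text to an external LLM, then restore them in the returned text.
import re

SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # illustrative pattern only

def mask(text: str) -> tuple[str, dict[str, str]]:
    mapping = {}
    def repl(m):
        token = f"<PII_{len(mapping)}>"
        mapping[token] = m.group(0)
        return token
    return SSN_RE.sub(repl, text), mapping

def unmask(text: str, mapping: dict[str, str]) -> str:
    for token, value in mapping.items():
        text = text.replace(token, value)
    return text

masked, mapping = mask("Customer SSN 123-45-6789 requested a limit increase.")
# `masked` is what would cross the wire; `mapping` never leaves your environment.
restored = unmask(masked, mapping)
```

The key property matches the answer above: the sensitive value is never transmitted, yet the response can still be reassembled locally into something readable.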

- [Participant] Got it. And a final question. Did you test your LLM model outside the scope of Pega, in terms of recall score or F1 score? Is the model mature enough to handle new data types or different document types: PDF, Excel, decks, PPT? Does it have any limitations, and how did you go through the training dataset versus the test dataset while deploying this solution? Do you have any measures of success from testing, and what confidence does the output have?

- Yeah, for us, we're still in the early stages.

- Okay.

- And so I don't have an answer for what you're asking there. But I think we're all on this journey together in terms of understanding the capabilities of what Pega can do, what use cases you're looking to accomplish, especially the security scenario you're talking about, and how to implement something in a way that meets its objectives. Again, this is relatively new. We finished this about two weeks prior to PegaWorld, and we got access not too long ago.

- Sure.

- But let's get together. We should get together in the Innovation Hub.

- Sure. Thank you.

- Because from a product perspective a lot of that testing and comparison was done. Absolutely, yeah.

- That's one of the questions from all the leaders: what's the roadmap, and how do we use it from a client perspective and scale it? Yeah. Thank you, appreciate it.

- Sure. Thanks for your questions.

- I think we can take one here and then we'll probably have to wrap.

- [Participant] Does Knowledge Buddy integrate with Pega Agile Studio so developers and staff can query about existing requirements and potential requirements for the applications?

- So I think what's interesting about Knowledge Buddy and Pega, in general, is that Knowledge Buddy is headless by nature, right? You access Knowledge Buddy via the Questions API. So if you're using it in customer service or platform that's all through the widget, all of that's been taken care of for you, but if you have a different use case or application, as long as that application can make an API call, you can interface with Knowledge Buddy. So-

- So you can't easily like query user stories using it or anything like that from Agile Studio?

- So if those user stories were loaded into the Knowledge Buddy vector database,

- So you have to-

- You could, yeah.

- Okay.

- That's not an out of the box feature today, but I think it's a great use case and we could, you know, create a cycle to suck those in and then you could ask Knowledge Buddy. Absolutely, yeah.
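
The "cycle to suck those in" could look something like the sketch below: pull user stories from Agile Studio (stubbed here as a list) and map each one to a content record for ingestion. The field names and target collection are illustrative assumptions, not the actual ingestion API contract.

```python
def story_to_content(story):
    """Map a user story (hypothetical tracker fields) to an ingestion payload."""
    return {
        "title": story["id"],
        "content": story["description"],
        "collection": "agile-studio-stories",  # hypothetical target collection
        "attributes": {"status": story["status"]},
    }

# Stand-in for records fetched from Agile Studio on a schedule.
stories = [
    {"id": "US-101", "description": "As a developer I want SSO for the claims app", "status": "Open"},
    {"id": "US-102", "description": "As an agent I want to export case history", "status": "Done"},
]

batch = [story_to_content(s) for s in stories]
# Each item in `batch` would then be POSTed to the content ingestion endpoint,
# keeping the vector database in step with Agile Studio.
```

Run periodically, a job like this would let the buddy answer questions about current and past requirements.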

- Thanks.

- Great question.

- Right, I think that'll do it.

- Yeah.

- One more? Wait, oh, wait.

- All right, one more.

- [Participant] A question, though some of the questions here have already covered parts of it. So the source for Knowledge Buddy is going to be your Postgres database, right? It's a vector database? Or, if I have an already existing database, can I go ahead and source from that?

- So we're not using Postgres anymore. We were pre-GA.

- Yeah, initially-

- So we must have had conversations a long time ago.

- Yeah. I was part of Pega Insight.

- Oh, okay. So I don't wanna leave you with a cliffhanger here, but that exact scenario you just talked about: you've got a vector database that you've already populated with the data, you don't wanna use ours, but you want Knowledge Buddy to be able to use that.

- Correct. That's what your question is, right?

- Yes.

- That is something that I'm talking to the product team about daily right now. So I don't know how much I can give away about what that's gonna look like, but I will say it's definitely on the roadmap for an upcoming release.

- Okay, okay.

- Yeah, yeah.

- [Participant] And the other thing, similar to that: I had data warehousing earlier, and now we have replaced that with Trino as a distributed query engine. So instead of connecting to a single data source, I can connect to multiple data sources across my organization and have my dataset ready. Now, if I could get my Knowledge Buddy to connect to that, that would be a great success, instead of setting up an entire data source for this specific purpose only. I would like to interact with my existing distributed query engine so that I have real-time data.

- I will say today, as you can see on the screen here, we've got additional context data, and today in 24.1, at runtime, you can pass JSON data into Knowledge Buddy and it can use that in the context for the response. So-

- Yeah.

- And you know, like I said, I think the product team's definitely looking into that external vector DB option, but today, we can issue that semantic query against the vector database, and we can also accept JSON at runtime, which is a good interim solution for what you're talking about, I think.
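
That interim pattern, indexed documents plus caller-supplied JSON folded into one prompt, can be sketched as below. The prompt layout and field names are illustrative, not the product's actual template; the JSON stands in for something like a fresh Trino query result.

```python
import json

def build_prompt(question, retrieved_chunks, runtime_json):
    """Combine vector-retrieved chunks with runtime JSON into one LLM prompt."""
    context = "\n".join(f"- {chunk}" for chunk in retrieved_chunks)
    live = json.dumps(runtime_json, indent=2)
    return (
        "Answer using only the context below.\n\n"
        f"Indexed documents:\n{context}\n\n"
        f"Live data (supplied at runtime):\n{live}\n\n"
        f"Question: {question}"
    )

prompt = build_prompt(
    "What is the customer's current balance?",
    ["Balances are refreshed nightly from the core banking system."],
    {"account": "12345", "balance": 1042.17},  # e.g. a distributed-query result
)
```

This keeps the vector database for slow-moving documents while real-time values ride along with each question.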

- [Participant] Yeah, that makes sense, thank you.

- Thank you.

- Appreciate you.

- Yeah. Yeah.

- [Participant] So in case we have to do a POC on this one, should we use an OpenAI API key, or does Pega provide an API key-

- We provide it, yeah. It's just provided with the product, yes.

- [Participant] Okay, and if we want to use our own API key, is that possible? Because in the initial days, when Pega came out with AI, we could use our own OpenAI API key, but-

- So today, we're providing all of those services through our OpenAI gateway.

- Okay.

- Or our GenAI gateway, I should say. So you don't have to bring keys, it's baked into the product and-

- [Participant] So, I mean, my question is: is there a possibility we can use our own API key?

- Not today, no.

- Okay, okay, thank you.

- Yeah.

- There's a lot to be learned at the booths in the Innovation Hub so, more questions, find these two, find the Aaseya booth, find knowledge about Knowledge Buddy. But yeah, that's it for today, thanks for joining.

- Thanks everyone.

- Thank you. ♪ I got way ♪
