Networking in OCI

When you work with Oracle Cloud Infrastructure, one of the first steps is to set up a virtual cloud network (VCN) for your cloud resources. In this episode, Lois Houston and Nikita Abraham, along with special guest Rohit Rahi, discuss Oracle’s Virtual Cloud Network, VCN routing, and security.

Oracle MyLearn: https://mylearn.oracle.com/
Oracle University Learning Community: https://education.oracle.com/ou-community

Twitter: https://twitter.com/Oracle_Edu
LinkedIn: https://www.linkedin.com/showcase/oracle-university/

Special thanks to Arijit Ghosh, Kiran BR, Rashmi Panda, David Wright, the OU Podcast Team, and the OU Studio Team for helping us create this episode.

Episodes (132)

GoldenGate 23ai: Terminology & Architecture

In this episode, Lois Houston and Nikita Abraham, along with Nick Wagner, focus on GoldenGate’s terminology and architectural evolution.   Nick defines source and target systems, which are crucial for data replication, and then moves on to explain the data extraction and replication processes.   He also talks about the new microservices architecture, which replaces the classic architecture, offering benefits like simplified management, enhanced security, and a user-friendly interface. Nick highlights how this architecture facilitates easy upgrades and provides a streamlined experience for administrators.   Oracle GoldenGate 23ai: Fundamentals: https://mylearn.oracle.com/ou/course/oracle-goldengate-23ai-fundamentals/145884/237273 Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode.   ---------------------------------------------------------------   Episode Transcript:   00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we’ll bring you foundational training on the most popular Oracle technologies. Let’s get started! 00:25 Nikita: Welcome to the Oracle University Podcast! I’m Nikita Abraham, Team Lead of Editorial Services with Oracle University, and with me is Lois Houston: Director of Innovation Programs. Lois: Hi there! Thanks for joining us again as we make our way through Oracle GoldenGate 23ai. Last week, we discussed all the new features introduced in 23ai and today, we’ll move on to the terminology, the different processes and what they do, and the architecture of the product at a high level. 00:56 Nikita: Back with us is Nick Wagner, Senior Director of Product Management for Oracle GoldenGate. Hi Nick! Let’s get into some of the terminology. What do we actually call stuff in GoldenGate? Nick: Within GoldenGate, we have our source systems and our target systems. The source is where we're going to be capturing data from, the targets, where we're going to be applying data into. And when we start talking about things like active-active or setting up GoldenGate for high availability, where your source can also be your target, it does become a little bit more complex. And so in some of those cases, we might refer to things as East and West, or America and Europe, or different versions of that. We also have a couple of different things within the product itself. We have what we call our Extract and our Replicat. The Extract is going to be the process that pulls the data out of the database, our capture technology. Our Replicat’s going to be the one that applies the data into the target system, or you can also look at it as a push technology. We have what we call our Distribution Path. Our Distribution Path is going to be how we're sending the data across the network. A lot of times when customers run GoldenGate, they don't have the luxury of just having a single server of GoldenGate that can pull data from one database and push data into another one. They need to set up multiple hops of that data. And so in that case, we would use what we call a Distribution Path to send that data from one system to the next. We also have what we call a Target Initiated Path. 
It's kind of a subset of your Distribution Path, but it allows you to communicate from a less secure environment into a more secure environment.  02:33 Lois: Nick, what about parameter names? I’ve seen them in uppercase…title case…does that matter?  Nick: GoldenGate has a lot of parameters. This is something you'll see all over the place within GoldenGate itself.  These parameters are in your Extract and Replicat parameter files, as well as your distribution path parameter files. Parameters for GoldenGate are case insensitive.  Within your own environments, you can set it up to have lowercase, mixed case, whatever you want, but just be aware that they are case insensitive. GoldenGate doesn't care, it's just for readability. And then we also have something called trail files. Trail files are where GoldenGate stores all the data before we're able to apply it into that target system. Think about it as our queuing mechanism, and we're queuing everything outside the database so that we're not overloading those database environments. And that's some of the terminology for the product itself. We also have microservices within GoldenGate. 03:31 Nikita: And at the heart of everything is the Service Manager, right? Talk to us about what it is and what it does. Nick: The service manager is responsible for making sure that everything else is up and running. If you are familiar with GoldenGate classic architecture, this is kind of similar to a GoldenGate manager, where that process was there to make sure that processes were running and that certain error logs were getting written out. If a process went down, the manager would restart that process. The service manager is performing a lot of those same functions. Now attached to the service manager, we have our configuration service. This is new in GoldenGate 23ai. This configuration service is going to allow you to set up GoldenGate for highly available environments. So you can build HA into GoldenGate itself using the configuration service. 04:22 Lois: And what does this configuration service do? Nick: This configuration service essentially moves the checkpoint files that used to be on disk into a database so that everything can be stored inside of a database.  Also attached to the service manager, we have the performance metric service. This is a service that is going to be gathering all the performance metrics of GoldenGate. So it's going to tell you how fast things are going, what the latencies are, how many bytes per second we're reading from the transaction logs or writing to our trail files, and how quickly a distribution path is sending data across a network. If you want to know any of your lag information, you'll get it from the performance metrics server. We also have the receiver service and the distribution service. These two work hand in hand to establish network communication between two GoldenGate environments. So on what we call our source system, we have a distribution service that's going to send the data to our target system. On the target system, a receiver service is going to receive that data and then rewrite the trail files. We also have the administration service that's responsible for authentication and authorization of the users, as well as making sure that people have access to the right information. 05:33 Nikita: Ok. Moving on to deployment, how is GoldenGate actually deployed, Nick? Nick: GoldenGate is kinda nice. 
So the way that the product is installed is you install the GoldenGate environment, and that's what we call our service manager deployment, under a specific GoldenGate home. So the software binaries themselves get installed under a home, we'll say U01/OGG23AI. Now once I've installed GoldenGate once, that's my OGG home. I can now have any number of service managers and deployments tied to that same home.  06:11 Lois: Ok, let’s work with an example to make this simpler. Let’s say I've got a service manager that's going to be responsible for three different deployments: Accounting, Finance, and Sales. Nick: Each of these deployments is going to reside in its own directory. Each of these deployments is going to have its own set of microservices. And so this also means that each of these deployments can have their own set of users. So the people that access the GoldenGate accounting deployment can be different than the ones that access the sales deployment. This means with this distribution of roles that I can have somebody come in and administer the sales database, but they wouldn't have any information or any access to accounting or finance. And this is very important, it allows you to really pull that information apart and separate it. Each of these environments also has their own set of parameter files, Extract process, Replicat process, distribution services, and everything. So it's a very nice way of splitting things up, but all having them tied to the same GoldenGate home system. And this home is very important. So I can take a deployment, let's say my finance deployment, and if I want to move it to a new GoldenGate home and that GoldenGate home is a different version, like let's say that my original home is 23.4, my new GoldenGate home is 23.7, I simply stop that GoldenGate deployment. I stop that finance deployment. I change its OGG home from 23.4 to 23.7. I restart the deployment, and that deployment is automatically upgraded to the new environment and attached to the new system. So it makes upgrading very, very simple, very easy, very elegant. 07:53 Nikita: Ok. So, we’ve spoken about the services…some of the terminology. Let’s get into the architecture next. Nick: So when we talk about the architecture for GoldenGate, we used to have two different architectures. We had a classic architecture and a microservices architecture. Classic architecture was something that's been around since the very beginning of GoldenGate in the late '90s. We announced that that architecture was deprecated in 19c. And at Oracle, deprecated means that the feature is no longer going to be enhanced and it'll be patched selectively. And at some point in the future, it'll be entirely desupported. Well, GoldenGate 23ai is that future. And so in 23ai, the classic architecture is desupported, which means that it's no longer in the build at all. And so it's just the microservices architecture. 08:41 Lois: Is there a tool to assist with this migration?  Nick: We do have a migration utility that will convert an old classic architecture into the new microservices architecture. But there is quite a bit of a learning curve to the new microservices architecture. So it's important that we go through how it works and the changes. 09:04 Are you looking to optimize your implementation strategies and improve efficiency? We have a solution for you! Our new Oracle Fusion Cloud Applications Foundations training and certification program. 
You’ll learn to leverage Oracle Modern Best Practice (OMBP) to re-imagine business processes using advanced technologies in Oracle Fusion Cloud Applications such as AI, mobile, analytics, and more. Visit mylearn.oracle.com to get started today.  09:37 Nikita: Welcome back! Nick, what are the benefits of this microservices architecture?  Nick: It's got that simplified lifecycle for patching and upgrading. A lot of the GoldenGate patches that you get, especially these bundle patches, are complete installs as well. So you can go into My Oracle Support and download a complete install of a patch and that way, you don't have to use OPatch to apply them. The only time you'll be using OPatch is for one-off patches or smaller patches that need to be applied to your GoldenGate system. The microservices product has the same trusted Capture and Apply process that Classic did. There are almost no changes between the two except on how they communicate with their parent processes. And so the same logic that you use to pull data from Oracle or to apply data into Oracle is all the same. 10:25 Lois: And has the interface been upgraded as well? Nick: We've added a really nice, easy-to-use web interface for the microservices version of GoldenGate. Not only does this web interface work with all your standard browsers, but it's also mobile friendly. So I can actually control and administer GoldenGate right through my mobile device. It also has new secure remote administration. This is something that the classic architecture was really missing. And so in the classic architecture, to use the command line interface, you had to log into the database server where GoldenGate was installed. Now, the command line interface, as well as the web interface and the REST API, all use remote administration and authentication. So that means that I can install the new command line interface, or what we call the admin client, on my laptop locally and I can connect to any GoldenGate deployment as long as I have the username and password for that deployment. It's also more secure. GoldenGate microservices can also be deployed on premise or in OCI as a service and now also on these third-party clouds like Azure and Google Cloud. And it's also easier for developers to integrate in with the APIs themselves. Everything that GoldenGate does through the admin client as well as the web UI can all be traced. The REST API calls for GoldenGate are all fully published so you can get them right directly from the documentation, and you can build your own web interface if you want to. So it makes it very easy. The REST APIs are also streamlined. With a single REST API call, I can do something like add an Extract process, create it, set up my parameter file, and set up the trail files all with a single API command. Whereas in the past, it would require multiple command line interface commands to do that same thing. So it's extremely elegant, very advanced. 12:16 Nikita: What does the microservices architecture look like? I know it’s a bit complicated when we’re not actually looking at a diagram of it, but just at a high level, can you explain the different parts of it? Nick: It's pretty straightforward. But essentially what you've got on each system is a service manager. That service manager is then going to have a number of processes or services beneath it. It'll have the configuration service that stores the checkpoint information for GoldenGate. 
It'll have the administrative service for the authentication and users, the distribution service to send the data across a network, a receiver service to receive that information, performance metrics to get the performance statistics out of GoldenGate. And then of course, you also have your Extracts and Replicats that capture and apply technology. Each of those Extracts and Replicats will then connect to a database on the Extract side of things. That Extract is going to write to trail files. Those trail files are then going to be sent across the network where they're rebuilt on the target system and the Replicat’s going to consume them and apply them into the target database. So the Replicat behaves almost like an end user. So it's taking that trail file data and simply converting it to DML operations, insert, update, delete, or a DDL operation in the case of Oracle, alter table, create table, et cetera, to go into that target database.   13:39 Lois: To look at a diagram of this architecture and learn about it in more detail, check out the Oracle GoldenGate 23ai Fundamentals course on mylearn.oracle.com. So, Nick, if I’m looking to deploy GoldenGate, what should I primarily keep in mind? Nick: So as you go to install GoldenGate and you look at a deployment, there's a couple of important environment variables that you want to make sure you're aware of. So one of the first ones is your OGG_Home. This environment variable is extremely important. This is the location of the GoldenGate software itself. And I want to stress how important it is to always use version numbers when you're setting up your GoldenGate home. When you go to install the software, if you're installing GoldenGate 23.5, use 23.5 within the home directory structure. If you're installing GoldenGate 23.7, use 23.7 inside that directory structure. 14:33 Nikita: Right… that way I’ll always know which versions are which, and it’ll make it really easy to upgrade and move from one version to the next. Ok, got it. What else, Nick? Nick: There's a couple other important directories. You have your OGG_ETC_HOME. This is where things like the configuration files are going to reside, parameter files, all your certificates for security, including the wallets where we store the credentials for not only the database accounts, but also for the GoldenGate user accounts as well. We have our GoldenGate variable home directory or VAR home. This is where all the GoldenGate log files are residing. And these are the log files that allow you to see what's going on in GoldenGate for auditing purposes. Anytime anybody makes a change to GoldenGate, you're going to see information go into the log files on what was happening and how it was working and what they did, what time they did, what command they issued. Another big important feature about these log files is it also gives you error information and troubleshooting details. So if you ever need to find out what happened in GoldenGate, what went wrong, you would look at these log files to find out that information. And then you also have your OGG_DATA_HOME. This is where those trail files are going to go. Essentially, this is kind of the queuing or overflow for GoldenGate. There's a couple of other additional components. We've got the admin client. This is our command line utility. If you don't want to use a web browser or prefer a command line utility, you can use the admin client. The admin client is also fully scriptable. 
So if you wanted to write scripts that would go off and automate things in GoldenGate, you can do that. A lot of customers did that with GGSCI in the classic architecture. You can do the same thing now with the admin client. The other component is the microservices security authentication and authorization services. These handle communication security, especially making sure that any passwords or usernames and everything like that is all encrypted. And instead of using an actual username and password, everything through the product is going to be done through an alias. And then it also handles all the authorization authentication, permissions, user accountability, and roles within GoldenGate. 16:39 Lois: Anything else you’d like to talk about before we wrap up for today, Nick? Nick: I also wanted to take a minute to talk about the REST API. All the microservices provide REST APIs to administer them and all of these are fully documented. They can be used by any client that can make REST API calls. So if you wanted to use Python, cURL, a web browser, you can do that as well. They're all just HTTP or HTTPS calls, get, put, patch, the standard REST API standards. And then GoldenGate does provide our admin client as well as a WebUI that use these REST APIs under the covers if you ever wanted to get a more advanced look at how it works. 17:18 Nikita: Well, that’s all the time we have for today. Thanks for joining us, Nick.  Lois: Yes, thanks Nick. We look forward to having you back next week to talk with us about security strategies and data recovery. Nikita: And if you want to learn more about the topics we discussed today, head over to mylearn.oracle.com and take a look at the Oracle GoldenGate 23ai Fundamentals course. Until next time, this is Nikita Abraham… Lois: And Lois Houston, signing off!  17:43 That’s all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We’d also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
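A note on the REST API discussion in this episode: Nick mentions that a single REST API call can create an Extract, supply its parameter file, and set up its trail files, and that the calls are ordinary HTTPS requests using standard verbs. As a rough illustration of what scripting that against the Administration Service could look like, here is a minimal Python sketch using the requests library. The host, port, credentials, endpoint path, and payload fields are assumptions made for the example rather than details from the episode; the exact resource paths and request schema for your release are in the published GoldenGate REST API documentation.

```python
import requests

# Hypothetical Administration Service URL and deployment credentials.
ADMIN_SERVICE = "https://gg-host.example.com:9011"
AUTH = ("oggadmin", "example-password")

# Illustrative payload: register an Extract named EXT1, its parameter file
# contents, and the trail it writes to, in a single request. Field names are
# assumptions; check the REST API reference for the real schema.
payload = {
    "description": "Capture from the finance schema",
    "config": [
        "EXTRACT EXT1",
        "USERIDALIAS ggadmin",
        "EXTTRAIL ea",
        "TABLE finance.*;",
    ],
    "source": {"tranlogs": "integrated"},
    "begin": "now",
    "targets": [{"name": "ea"}],
    "status": "stopped",
}

resp = requests.post(
    f"{ADMIN_SERVICE}/services/v2/extracts/EXT1",  # assumed endpoint shape
    json=payload,
    auth=AUTH,
)
resp.raise_for_status()
print(resp.json())
```

The same pattern of GET, POST, PATCH, and DELETE requests applies to Replicats, distribution paths, and credential stores, which is what the admin client and the web UI use under the covers.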

13 May, 18 min

Oracle GoldenGate 23ai: New Features & Product Family

In this episode, Lois Houston and Nikita Abraham continue their deep dive into Oracle GoldenGate 23ai, focusing on its evolution and the extensive features it offers. They are joined once again by Nick Wagner, who provides valuable insights into the product's journey.   Nick talks about the various iterations of Oracle GoldenGate, highlighting the significant advancements from version 12c to the latest 23ai release. The discussion then shifts to the extensive new features in 23ai, including AI-related capabilities, UI enhancements, and database function integration.   Oracle GoldenGate 23ai: Fundamentals: https://mylearn.oracle.com/ou/course/oracle-goldengate-23ai-fundamentals/145884/237273 Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode.   -----------------------------------------------------------------   Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we’ll bring you foundational training on the most popular Oracle technologies. Let’s get started! 00:25 Lois: Hello and welcome to the Oracle University Podcast! I’m Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Team Lead: Editorial Services.  Nikita: Hi everyone! Last week, we introduced Oracle GoldenGate and its capabilities, and also spoke about GoldenGate 23ai. In today’s episode, we’ll talk about the various iterations of Oracle GoldenGate since its inception. And we’ll also take a look at some new features and the Oracle GoldenGate product family. 00:57 Lois: And we have Nick Wagner back with us. Nick is a Senior Director of Product Management for GoldenGate at Oracle. Hi Nick! I think the last time we had an Oracle University course was when Oracle GoldenGate 12c was out. I’m sure there have been a lot of advancements since then. Can you walk us through those? Nick: GoldenGate 12.3 introduced the microservices architecture. GoldenGate 18c introduced support for Oracle Autonomous Data Warehouse and Autonomous Transaction Processing Databases. In GoldenGate 19c, we added the ability to do cross-endian remote capture for Oracle, making it easier to set up the GoldenGate OCI service to capture from environments like Solaris, SPARC, and HP-UX and replicate into the Cloud. Also, GoldenGate 19c introduced a simpler process for upgrades and installation of GoldenGate where we released something called a unified build. This means that when you install GoldenGate for a particular database, you don't need to worry about the database version when you install GoldenGate. Prior to this, you would have to install a version-specific and database-specific version of GoldenGate. So this really simplified that whole process. In GoldenGate 23ai, which is where we are now, this really is a huge release.  02:16 Nikita: Yeah, we covered some of the distributed AI features and high availability environments in our last episode. But can you give us an overview of everything that’s in the 23ai release? I know there’s a lot to get into but maybe you could highlight just the major ones? Nick: Within the AI and streaming environments, we've got interoperability for database vector types, heterogeneous capture and apply as well. 
Again, this is not just replication between Oracle-to-Oracle vector or Postgres to Postgres vector, it is heterogeneous just like the rest of GoldenGate. The entire UI has been redesigned and optimized for high speed. And so we have a lot of customers that have dozens and dozens of extracts and replicats and processes running and it was taking a long time for the UI to refresh those and to show what's going on within those systems. So the UI has been optimized to be able to handle those environments much better. We now have the ability to call database functions directly from call map. And so when you do transformation with GoldenGate, we have about 50 or 60 built-in transformation routines for string conversion, arithmetic operation, date manipulation. But we never had the ability to directly call a database function. 03:28 Lois: And now we do? Nick: So now you can actually call that database function, database stored procedure, database package, return a value and that can be used for transformation within GoldenGate. We have integration with identity providers, being able to use token-based authentication and integrate in with things like Azure Active Directory and your other single sign-on for the GoldenGate product itself. Within Oracle 23ai, there's a number of new features. One of those cool features is something called lock-free reservation columns. So this allows you to have a row, a single row within a table and you can identify a column within that row that's like an inventory column. And you can have multiple different users and multiple different transactions all updating that column within that same exact row at that same time. So you no longer have row-level locking for these reservation columns. And it allows you to do things like shopping carts very easily. If I have 500 widgets to sell, I'm going to let any number of transactions come in and subtract from that inventory column. And then once it gets below a certain point, then I'll start enforcing that row-level locking. 04:43 Lois: That’s really cool… Nick: The one key thing that I wanted to mention here is that because of the way that the lock-free reservations work, you can have multiple transactions open on the same row. This is only supported for Oracle to Oracle. You need to have that same lock-free reservation data type and availability on that target system if GoldenGate is going to replicate into it. 05:05 Nikita: Are there any new features related to the diagnosability and observability of GoldenGate?  Nick: We've improved the AWR reports in Oracle 23ai. There's now seven sections that are specific to Oracle GoldenGate to allow you to really go in and see exactly what the GoldenGate processes are doing and how they're behaving inside the database itself. And there's a Replication Performance Advisor package inside that database, and that's been integrated into the Web UI as well. So now you can actually get information out of the replication advisor package in Oracle directly from the UI without having to log into the database and try to run any database procedures to get it. We've also added the ability to support a per-PDB Extract.  So in the past, when GoldenGate would run on a multitenant database, a multitenant database in Oracle, all the redo data from any pluggable database gets sent to that one redo stream. And so you would have to configure GoldenGate at the container or root level and it would be able to access anything at any PDB. 
Now, there's better security and better performance by doing what we call per-PDB Extract. And this means that for a single pluggable database, I can have an extract that runs at that database level that's going to capture information just from that pluggable database. 06:22 Lois And what about non-Oracle environments, Nick? Nick: We've also enhanced the non-Oracle environments as well. For example, in Postgres, we've added support for precise instantiation using Postgres snapshots. This eliminates the need to handle collisions when you're doing Postgres to Postgres replication and initial instantiation. On the GoldenGate for big data side, we've renamed that product more aptly to distributed applications in analytics, which is really what it does, and we've added a whole bunch of new features here too. The ability to move data into Databricks, doing Google Pub/Sub delivery. We now have support for XAG within the GoldenGate for distributed applications and analytics. What that means is that now you can follow all of our MAA best practices for GoldenGate for Oracle, but it also works for the DAA product as well, meaning that if it's running on one node of a cluster and that node fails, it'll restart itself on another node in the cluster. We've also added the ability to deliver data to Redis, Google BigQuery, stage and merge functionality for better performance into the BigQuery product. And then we've added a completely new feature, and this is something called streaming data and apps and we're calling it AsyncAPI and CloudEvent data streaming. It's a long name, but what that means is that we now have the ability to publish changes from a GoldenGate trail file out to end users. And so this allows through the Web UI or through the REST API, you can now come into GoldenGate and through the distributed applications and analytics product, actually set up a subscription to a GoldenGate trail file. And so this allows us to push data into messaging environments, or you can simply subscribe to changes and it doesn't have to be the whole trail file, it can just be a subset. You can specify exactly which tables and you can put filters on that. You can also set up your topologies as well. So, it's a really cool feature that we've added here. 08:26 Nikita: Ok, you’ve given us a lot of updates about what GoldenGate can support. But can we also get some specifics? Nick: So as far as what we have, on the Oracle Database side, there's a ton of different Oracle databases we support, including the Autonomous Databases and all the different flavors of them, your Oracle Database Appliance, your Base Database Service within OCI, your of course, Standard and Enterprise Edition, as well as all the different flavors of Exadata, are all supported with GoldenGate. This is all for capture and delivery. And this is all versions as well. GoldenGate supports Oracle 23ai and below. We also have a ton of non-Oracle databases in different Cloud stores. On an non-Oracle side, we support everything from application-specific databases like FairCom DB, all the way to more advanced applications like Snowflake, which there's a vast user base for that. We also support a lot of different cloud stores and these again, are non-Oracle, nonrelational systems, or they can be relational databases. We also support a lot of big data platforms and this is part of the distributed applications and analytics side of things where you have the ability to replicate to different Apache environments, different Cloudera environments. 
We also support a number of open-source systems, including things like Apache Cassandra, MySQL Community Edition, a lot of different Postgres open source databases, along with MariaDB. And then we have a bunch of streaming event products, NoSQL data stores, and even Oracle applications that we support. So there's absolutely a ton of different environments that GoldenGate supports. There are additional Oracle databases that we support, and this includes the Oracle Metadata Service, as well as Oracle MySQL, including MySQL HeatWave. Oracle also has Oracle NoSQL, Spatial and Graph, and TimesTen products, which again are all supported by GoldenGate. 10:23 Lois: Wow, that’s a lot of information! Nick: One of the things that we didn't really cover was the different SaaS applications, which we've got like Cerner, Fusion Cloud, Hospitality, Retail, MICROS, Oracle Transportation, JD Edwards, Siebel, and on and on and on.  And again, because of the nature of GoldenGate, it's heterogeneous. Any source can talk to any target. And so it doesn't have to be, oh, I'm pulling from Oracle Fusion Cloud, that means I have to go to an Oracle Database on the target, not necessarily.  10:51 Lois: So, there’s really a massive amount of flexibility built into the system.  11:00 Unlock the power of AI Vector Search with our new course and certification. Get more accurate search results, handle complex datasets easily, and supercharge your data-driven decisions. From now through May 15, 2025, we are waiving the certification exam fee (valued at $245). Visit mylearn.oracle.com to enroll. 11:26 Nikita: Welcome back! Now that we’ve gone through the base product, what other features or products are in the GoldenGate family itself, Nick? Nick: So we have quite a few. We've kind of touched already on GoldenGate for Oracle databases and non-Oracle databases. We also have something called GoldenGate for Mainframe, which right now is covered under the GoldenGate for non-Oracle license, but there is a licensing difference there. So that's something to be aware of. We also have the OCI GoldenGate product. We are announcing, and we have announced, that OCI GoldenGate will also be made available as part of the Oracle Database@Azure and Oracle Database@Google Cloud partnerships.  And then you'll be able to use that vendor's cloud credits to actually pay for the OCI GoldenGate product. One of the cool things about this is it will have full feature parity with OCI GoldenGate running in OCI. So all the same features, all the same sources and targets, all the same topologies, and the ability to migrate data in and out of those clouds at will, just like you do with OCI GoldenGate today running in OCI.  We have Oracle GoldenGate Free.  This is a completely free edition of GoldenGate to use. It is limited on the number of platforms that it supports as far as sources and targets and the size of the database. 
We also have OCI GoldenGate Marketplace, which allows you to run essentially the on-premises version of GoldenGate but within OCI. So a little bit more flexibility there. It also has a hub architecture. So if you need that 99.99% availability, you can get it within the OCI Marketplace environment. We have GoldenGate for Oracle Enterprise Manager Cloud Control, which used to be called Oracle Enterprise Manager. And this allows you to use Enterprise Manager Cloud Control to get all the statistics and details about GoldenGate. So all the reporting information, all the analytics, all the statistics, how fast GoldenGate is replicating, what's the lag, what's the performance of each of the processes, how much data am I sending across a network. All that's available within the plug-in. We also have Oracle GoldenGate Veridata. This is a nice utility and tool that allows you to compare two databases, whether or not GoldenGate is running between them and actually tell you, hey, these two systems are out of sync. And if they are out of sync, it actually allows you to repair the data too. 14:25 Nikita: That’s really valuable…. Nick: And it does this comparison without locking the source or the target tables. The other really cool thing about Veridata is it does this while there's data in flight. So let's say that the GoldenGate lag is 15 or 20 seconds and I want to compare this table that has 10 million rows in it. The Veridata product will go out, run its comparison once. Once that comparison is done the first time, it's then going to have a list of rows that are potentially out of sync. Well, some of those rows could have been moved over or could have been modified during that 10 to 15 second window. And so the next time you run Veridata, it's actually going to go through. It's going to check just those rows that were potentially out of sync to see if they're really out of sync or not. And if it comes back and says, hey, out of those potential rows, there's two out of sync, it'll actually produce a script that allows you to resynchronize those systems and repair them. So it's a very cool product.  15:19 Nikita: What about GoldenGate Stream Analytics? I know you mentioned it in the last episode, but in the context of this discussion, can you tell us a little more about it?  Nick: This is the ability to essentially stream data from a GoldenGate trail file, and they do a real time analytics on it. And also things like geofencing or real-time series analysis of it.  15:40 Lois: Could you give us an example of this? Nick: If I'm working in tracking stock market information and stocks, it's not really that important on how much or how far down a stock goes. What's really important is how quickly did that stock rise or how quickly did that stock fall. And that's something that GoldenGate Stream Analytics product can do. Another thing that it's very valuable for is the geofencing. I can have an application on my phone and I can track where the user is based on that application and all that information goes into a database. I can then use the geofencing tool to say that, hey, if one of those users on that app gets within a certain distance of one of my brick-and-mortar stores, I can actually send them a push notification to say, hey, come on in and you can order your favorite drink just by clicking Yes, and we'll have it ready for you. And so there's a lot of things that you can do there to help upsell your customers and to get more revenue just through GoldenGate itself. 
And then we also have a GoldenGate Migration Utility, which allows customers to migrate from the classic architecture into the microservices architecture. 16:44 Nikita: Thanks Nick for that comprehensive overview.  Lois: In our next episode, we’ll have Nick back with us to talk about commonly used terminology and the GoldenGate architecture. And if you want to learn more about what we discussed today, visit mylearn.oracle.com and take a look at the Oracle GoldenGate 23ai Fundamentals course. Until next time, this is Lois Houston… Nikita: And Nikita Abraham, signing off! 17:10 That’s all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We’d also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
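One point worth illustrating from the Veridata discussion in this episode: the in-flight comparison Nick describes is essentially a two-pass algorithm. The first pass flags rows whose source and target values differ (some of which may simply be changes still making their way through replication), and a second pass rechecks only the flagged rows once the replication lag has had time to catch up. The sketch below is purely conceptual, with hypothetical fetch_source and fetch_target lookups; it shows the shape of the idea, not how the Veridata product is implemented.

```python
import hashlib
import time
from typing import Callable, Iterable, Set

def row_hash(row: dict) -> str:
    """Hash a row's column values so rows can be compared cheaply."""
    return hashlib.sha256(repr(sorted(row.items())).encode()).hexdigest()

def compare_once(keys: Iterable, fetch_source: Callable, fetch_target: Callable) -> Set:
    """Return the keys whose source and target rows currently differ."""
    return {k for k in keys
            if row_hash(fetch_source(k)) != row_hash(fetch_target(k))}

def two_pass_compare(all_keys, fetch_source, fetch_target, lag_seconds=20):
    # Pass 1: scan everything; mismatches may just be changes still in flight.
    maybe_out_of_sync = compare_once(all_keys, fetch_source, fetch_target)
    # Wait at least as long as the observed replication lag.
    time.sleep(lag_seconds)
    # Pass 2: recheck only the flagged rows; what remains is genuinely out of
    # sync and would be a candidate for a repair script.
    return compare_once(maybe_out_of_sync, fetch_source, fetch_target)
```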

6 May, 17 min

What is Oracle GoldenGate 23ai?

In a new season of the Oracle University Podcast, Lois Houston and Nikita Abraham dive into the world of Oracle GoldenGate 23ai, a cutting-edge software solution for data management. They are joined by Nick Wagner, a seasoned expert in database replication, who provides a comprehensive overview of this powerful tool.   Nick highlights GoldenGate's ability to ensure continuous operations by efficiently moving data between databases and platforms with minimal overhead. He emphasizes its role in enabling real-time analytics, enhancing data security, and reducing costs by offloading data to low-cost hardware. The discussion also covers GoldenGate's role in facilitating data sharing, improving operational efficiency, and reducing downtime during outages.   Oracle GoldenGate 23ai: Fundamentals: https://mylearn.oracle.com/ou/course/oracle-goldengate-23ai-fundamentals/145884/237273 Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode. ---------------------------------------------------------------   Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we’ll bring you foundational training on the most popular Oracle technologies. Let’s get started! 00:25 Nikita: Welcome to the Oracle University Podcast! I’m Nikita Abraham, Team Lead: Editorial Services with Oracle University, and with me is Lois Houston, Director of Innovation Programs. Lois: Hi everyone! Welcome to a new season of the podcast. This time, we’re focusing on the fundamentals of Oracle GoldenGate. Oracle GoldenGate helps organizations manage and synchronize their data across diverse systems and databases in real time.  And with the new Oracle GoldenGate 23ai release, we’ll uncover the latest innovations and features that empower businesses to make the most of their data. Nikita: Taking us through this is Nick Wagner, Senior Director of Product Management for Oracle GoldenGate. He’s been doing database replication for about 25 years and has been focused on GoldenGate on and off for about 20 of those years.  01:18 Lois: In today’s episode, we’ll ask Nick to give us a general overview of the product, along with some use cases and benefits. Hi Nick! To start with, why do customers need GoldenGate? Nick: Well, it delivers continuous operations, being able to continuously move data from one database to another database or data platform in an efficient, high-speed manner, and it does this with very low overhead. Almost all the GoldenGate environments use transaction logs to pull the data out of the system, so we're not creating any additional triggers, and there's very little overhead on that source system. GoldenGate can also enable real-time analytics. Being able to pull data from all these different databases and move them into your analytics system in real time can improve the value that those analytics systems provide. Being able to do real-time statistics and analysis of that data within those high-performance custom environments is really important. 02:13 Nikita: Does it offer any benefits in terms of cost?  Nick: GoldenGate can also lower IT costs. A lot of times people run these massive OLTP databases, and they are running reporting in those same systems. 
With GoldenGate, you can offload some of the data or all the data to a low-cost commodity hardware where you can then run the reports on that other system. So, this way, you can get back that performance on the OLTP system, while at the same time optimizing your reporting environment for those long running reports. You can improve efficiencies and reduce risks. Being able to reduce the amount of downtime during planned and unplanned outages can really make a big benefit to the overall operational efficiencies of your company.  02:54 Nikita: What about when it comes to data sharing and data security? Nick: You can also reduce barriers to data sharing. Being able to pull subsets of data, or just specific pieces of data out of a production database and move it to the team or to the group that needs that information in real time is very important. And it also protects the security of your data by only moving in the information that they need and not the entire database. It also provides extensibility and flexibility, being able to support multiple different replication topologies and architectures. 03:24 Lois: Can you tell us about some of the use cases of GoldenGate? Where does GoldenGate truly shine?  Nick: Some of the more traditional use cases of GoldenGate include use within the multicloud fabric. Within a multicloud fabric, this essentially means that GoldenGate can replicate data between on-premise environments, within cloud environments, or hybrid, cloud to on-premise, on-premise to cloud, or even within multiple clouds. So, you can move data from AWS to Azure to OCI. You can also move between the systems themselves, so you don't have to use the same database in all the different clouds. For example, if you wanted to move data from AWS Postgres into Oracle running in OCI, you can do that using Oracle GoldenGate. We also support maximum availability architectures. And so, there's a lot of different use cases here, but primarily geared around reducing your recovery point objective and recovery time objective. 04:20 Lois: Ah, reducing RPO and RTO. That must have a significant advantage for the customer, right? Nick: So, reducing your RPO and RTO allows you to take advantage of some of the benefits of GoldenGate, being able to do active-active replication, being able to set up GoldenGate for high availability, real-time failover, and it can augment your active Data Guard and Data Guard configuration. So, a lot of times GoldenGate is used within Oracle's maximum availability architecture platinum tier level of replication, which means that at that point you've got lots of different capabilities within the Oracle Database itself. But to help eke out that last little bit of high availability, you want to set up an active-active environment with GoldenGate to really get true zero RPO and RTO. GoldenGate can also be used for data offloading and data hubs. Being able to pull data from one or more source systems and move it into a data hub, or into a data warehouse for your operational reporting. This could also be your analytics environment too. 05:22 Nikita: Does GoldenGate support online migrations? Nick: In fact, a lot of companies actually get started in GoldenGate by doing a migration from one platform to another. Now, these don't even have to be something as complex as going from one database like a DB2 on-premise into an Oracle on OCI, it could even be simple migrations. 
A lot of times doing something like a major application or a major database version upgrade is going to take downtime on that production system. You can use GoldenGate to eliminate that downtime. So this could be going from Oracle 19c to Oracle 23ai, or going from application version 1.0 to application version 2.0, because GoldenGate can do the transformation between the different application schemas. You can use GoldenGate to migrate your database from on premise into the cloud with no downtime as well. We also support real-time analytic feeds, being able to go from multiple databases, not only those on premise, but being able to pull information from different SaaS applications inside of OCI and move it to your different analytic systems. And then, of course, we also have the ability to stream events and analytics within GoldenGate itself.  06:34 Lois: Let's move on to the various topologies supported by GoldenGate. I know GoldenGate supports many different platforms and can be used with just about any database. Nick: This first layer of topologies is what we usually consider relational database topologies. And so this would be moving data from Oracle to Oracle, Postgres to Oracle, Sybase to SQL Server, a lot of different types of databases. So the first architecture would be unidirectional. This is replicating from one source to one target. You can do this for reporting. If I wanted to offload some reports into another server, I can go ahead and do that using GoldenGate. I can replicate the entire database or just a subset of tables. I can also set up GoldenGate for bidirectional, and this is what I want to set up GoldenGate for something like high availability. So in the event that one of the servers crashes, I can almost immediately reconnect my users to the other system. And that almost immediately depends on the amount of latency that GoldenGate has at that time. So a typical latency is anywhere from 3 to 6 seconds. So after that primary system fails, I can reconnect my users to the other system in 3 to 6 seconds. And I can do that because as GoldenGate’s applying data into that target database, that target system is already open for read and write activity. GoldenGate is just another user connecting in issuing DML operations, and so it makes that failover time very low. 07:59 Nikita: Ok…If you can get it down to 3 to 6 seconds, can you bring it down to zero? Like zero failover time?   Nick: That's the next topology, which is active-active. And in this scenario, all servers are read/write all at the same time and all available for user activity. And you can do multiple topologies with this as well. You can do a mesh architecture, which is where every server talks to every other server. This works really well for 2, 3, 4, maybe even 5 environments, but when you get beyond that, having every server communicate with every other server can get a little complex. And so at that point we start looking at doing what we call a hub and spoke architecture, where we have lots of different spokes. At the end of each spoke is a read/write database, and then those communicate with a hub. So any change that happens on one spoke gets sent into the hub, and then from the hub it gets sent out to all the other spokes. And through that architecture, it allows you to really scale up your environments. We have customers that are doing up to 150 spokes within that hub architecture. 
Within active-active replication as well, we can do conflict detection and resolution, which means that if two users modify the same row on two different systems, GoldenGate can actually determine that there was an issue with that and determine what user wins or which row change wins, which is extremely important when doing active-active replication. And this means that if one of those systems fails, there is no downtime when you switch your users to another active system because it's already available for activity and ready to go. 09:35 Lois: Wow, that’s fantastic. Ok, tell us more about the topologies. Nick: GoldenGate can do other things like broadcast, sending data from one system to multiple systems, or many to one as far as consolidation. We can also do cascading replication, so when data moves from one environment that GoldenGate is replicating into another environment that GoldenGate is replicating. By default, we ignore all of our own transactions. But there's actually a toggle switch that you can flip that says, hey, GoldenGate, even though you wrote that data into that database, still push it on to the next system. And then of course, we can also do distribution of data, and this is more like moving data from a relational database into something like a Kafka topic or a JMS queue or into some messaging service. 10:24 Raise your game with the Oracle Cloud Applications skills challenge. Get free training on Oracle Fusion Cloud Applications, Oracle Modern Best Practice, and Oracle Cloud Success Navigator. Pass the free Oracle Fusion Cloud Foundations Associate exam to earn a Foundations Associate certification. Plus, there’s a chance to win awards and prizes throughout the challenge! What are you waiting for? Join the challenge today by visiting visit oracle.com/education. 10:58 Nikita: Welcome back! Nick, does GoldenGate also have nonrelational capabilities?  Nick: We have a number of nonrelational replication events in topologies as well. This includes things like data lake ingestion and streaming ingestion, being able to move data and data objects from these different relational database platforms into data lakes and into these streaming systems where you can run analytics on them and run reports. We can also do cloud ingestion, being able to move data from these databases into different cloud environments. And this is not only just moving it into relational databases with those clouds, but also their data lakes and data fabrics. 11:38 Lois: You mentioned a messaging service earlier. Can you tell us more about that? Nick: Messaging replication is also possible. So we can actually capture from things like messaging systems like Kafka Connect and JMS, replicate that into a relational data, or simply stream it into another environment. We also support NoSQL replication, being able to capture from MongoDB and replicate it onto another MongoDB for high availability or disaster recovery, or simply into any other system. 12:06 Nikita: I see. And is there any integration with a customer’s SaaS applications? Nick: GoldenGate also supports a number of different OCI SaaS applications. And so a lot of these different applications like Oracle Financials Fusion, Oracle Transportation Management, they all have GoldenGate built under the covers and can be enabled with a flag that you can actually have that data sent out to your other GoldenGate environment. So you can actually subscribe to changes that are happening in these other systems with very little overhead. 
And then of course, we have event processing and analytics, and this is the final topology or flexibility within GoldenGate itself. And this is being able to push data through data pipelines, doing data transformations. GoldenGate is not an ETL tool, but it can do row-level transformation and row-level filtering.  12:55 Lois: Are there integrations offered by Oracle GoldenGate in automation and artificial intelligence? Nick: We can do time series analysis and geofencing using the GoldenGate Stream Analytics product. It allows you to actually do real time analysis and time series analysis on data as it flows through the GoldenGate trails. And then that same product, the GoldenGate Stream Analytics, can then take the data and move it to predictive analytics, where you can run MML on it, or ONNX or other Spark-type technologies and do real-time analysis and AI on that information as it's flowing through.  13:29 Nikita: So, GoldenGate is extremely flexible. And given Oracle's focus on integrating AI into its product portfolio, what about GoldenGate? Does it offer any AI-related features, especially since the product name has “23ai” in it? Nick: With the advent of Oracle GoldenGate 23ai, it's one of the two products at this point that has the AI moniker at Oracle. Oracle Database 23ai also has it, and that means that we actually do stuff with AI. So the Oracle GoldenGate product can actually capture vectors from databases like MySQL HeatWave, Postgres using pgvector, which includes things like AlloyDB, Amazon RDS Postgres, Aurora Postgres. We can also replicate data into Elasticsearch and OpenSearch, or if the data is using vectors within OCI or the Oracle Database itself. So GoldenGate can be used for a number of things here. The first one is being able to migrate vectors into the Oracle Database. So if you're using something like Postgres, MySQL, and you want to migrate the vector information into the Oracle Database, you can. Now one thing to keep in mind here is a vector is oftentimes like a GPS coordinate. So if I need to know the GPS coordinates of Austin, Texas, I can put in a latitude and longitude and it will give me the GPS coordinates of a building within that city. But if I also need to know the altitude of that same building, well, that's going to be a different algorithm. And GoldenGate and replicating vectors is the same way. When you create a vector, it's essentially just creating a bunch of numbers under the screen, kind of like those same GPS coordinates. The dimension and the algorithm that you use to generate that vector can be different across different databases, but the actual meaning of that data will change. And so GoldenGate can replicate the vector data as long as the algorithm and the dimensions are the same. If the algorithm and the dimensions are not the same between the source and the target, then you'll actually want GoldenGate to replicate the base data that created that vector. And then once GoldenGate replicates the base data, it'll actually call the vector embedding technology to re-embed that data and produce that numerical formatting for you.  15:42 Lois: So, there are some nuances there… Nick: GoldenGate can also replicate and consolidate vector changes or even do the embedding API calls itself. This is really nice because it means that we can take changes from multiple systems and consolidate them into a single one. We can also do the reverse of that too. A lot of customers are still trying to find out which algorithms work best for them. 
How many dimensions? What's the optimal use? Well, you can now run those in different servers without impacting your actual AI system. Once you've identified which algorithm and dimension is going to be best for your data, you can then have GoldenGate replicate that into your production system and we'll start using that instead. So it's a nice way to switch algorithms without taking extensive downtime. 16:29 Nikita: What about in multicloud environments?  Nick: GoldenGate can also do multicloud and N-way active-active Oracle replication between vectors. So if there's vectors in Oracle databases, in multiple clouds, or multiple on-premise databases, GoldenGate can synchronize them all up. And of course we can also stream changes from vector information, including text as well into different search engines. And that's where the integration with Elasticsearch and OpenSearch comes in. And then we can use things like NVIDIA and Cohere to actually do the AI on that data.  17:01 Lois: Using GoldenGate with AI in the database unlocks so many possibilities. Thanks for that detailed introduction to Oracle GoldenGate 23ai and its capabilities, Nick.  Nikita: We’ve run out of time for today, but Nick will be back next week to talk about how GoldenGate has evolved over time and its latest features. And if you liked what you heard today, head over to mylearn.oracle.com and take a look at the Oracle GoldenGate 23ai Fundamentals course to learn more. Until next time, this is Nikita Abraham… Lois: And Lois Houston, signing off! 17:33 That’s all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We’d also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
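Nick's rule of thumb for vector replication in this episode is worth restating: a vector can be copied as-is only when the source and target use the same embedding algorithm and dimension count; otherwise the base data is replicated and re-embedded on the target. The following is a small conceptual Python sketch of that decision, assuming hypothetical column metadata and an embed placeholder; it is an illustration of the rule, not GoldenGate code.

```python
from dataclasses import dataclass

@dataclass
class VectorColumn:
    model: str        # embedding algorithm/model identifier (hypothetical)
    dimensions: int   # e.g. 384, 768, 1536

def embed(text: str, model: str, dimensions: int) -> list:
    """Placeholder for whatever embedding API the target environment provides."""
    raise NotImplementedError

def replicate_row(row: dict, source_col: VectorColumn, target_col: VectorColumn) -> dict:
    if (source_col.model, source_col.dimensions) == (target_col.model, target_col.dimensions):
        # Same algorithm and dimensions: the numbers mean the same thing on
        # both sides, so the vector can be shipped unchanged.
        return {"text": row["text"], "vec": row["vec"]}
    # Different algorithm or dimensions: ship the base data and rebuild the
    # vector with the target's own embedding call.
    return {"text": row["text"],
            "vec": embed(row["text"], target_col.model, target_col.dimensions)}
```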

29 Apr 18min

Integrating APEX with OCI AI Services

Integrating APEX with OCI AI Services

Discover how Oracle APEX leverages OCI AI services to build smarter, more efficient applications. Hosts Lois Houston and Nikita Abraham interview APEX experts Chaitanya Koratamaddi, Apoorva Srinivas, and Toufiq Mohammed about how key services like OCI Vision, Oracle Digital Assistant, and Document Understanding integrate with Oracle APEX.   Packed with real-world examples, this episode highlights all the ways you can enhance your APEX apps.   Oracle APEX: Empowering Low Code Apps with AI: https://mylearn.oracle.com/ou/course/oracle-apex-empowering-low-code-apps-with-ai/146047/ Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode.   ---------------------------------------------------------------   Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we’ll bring you foundational training on the most popular Oracle technologies. Let’s get started! 00:25 Lois: Hello and welcome to the Oracle University Podcast. I’m Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Team Lead: Editorial Services. Nikita: Hi everyone! Last week, we looked at how generative AI powers Oracle APEX and in today’s episode, we’re going to focus on integrating APEX with OCI AI Services. Lois: That’s right, Niki. We’re going to look at how you can use Oracle AI services like OCI Vision, Oracle Digital Assistant, Document Understanding, OCI Generative AI, and more to enhance your APEX apps. 01:03 Nikita: And to help us with it all, we’ve got three amazing experts with us, Chaitanya Koratamaddi, Director of Product Management at Oracle, and senior product managers, Apoorva Srinivas and Toufiq Mohammed. In today’s episode, we’ll go through each Oracle AI service and look at how it interacts with APEX. Apoorva, let’s start with you. Can you explain what the OCI Vision service is? Apoorva: Oracle Cloud Infrastructure Vision is a serverless multi-tenant service accessible using the console or REST APIs. You can upload images to detect and classify objects in them. With prebuilt models available, developers can quickly build image recognition into their applications without machine learning expertise. OCI Vision service provides a fully managed model infrastructure. With complete integration with OCI Data Labeling, you can build custom models easily. OCI Vision service provides pretrained models-- Image Classification, Object Detection, Face Detection, and Text Recognition. You can build custom models for Image Classification and Object Detection. 02:24 Lois: Ok. What about its use cases? How can OCI Vision make APEX apps more powerful? Apoorva: Using OCI Vision, you can make images and videos discoverable and searchable in your APEX app.  You can use OCI Vision to detect and classify objects in the images. OCI Vision also highlights the objects using a red rectangular box. This comes in handy in use cases such as detecting vehicles that have violated the rules in traffic images. You can use OCI Vision to identify visual anomalies in your data. This is a very popular use case where you can detect anomalies in cancer X-ray images to detect cancer. These are some of the most popular use cases of using OCI Vision with your APEX app. 
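As a rough sketch of the object detection call Apoorva describes, here is how an image sitting in OCI Object Storage might be analyzed with the OCI Python SDK (an APEX app would typically reach the same REST API through its web credentials instead). The compartment OCID, namespace, bucket, and object names are placeholders, and the model class names should be checked against your installed SDK version.

```python
import oci

# Standard OCI config from ~/.oci/config; the profile name is an assumption.
config = oci.config.from_file(profile_name="DEFAULT")
vision = oci.ai_vision.AIServiceVisionClient(config)

details = oci.ai_vision.models.AnalyzeImageDetails(
    compartment_id="ocid1.compartment.oc1..example",       # placeholder OCID
    features=[oci.ai_vision.models.ImageObjectDetectionFeature()],
    image=oci.ai_vision.models.ObjectStorageImageDetails(
        namespace_name="mynamespace",                       # placeholder
        bucket_name="traffic-images",                       # placeholder
        object_name="intersection-042.jpg",                 # placeholder
    ),
)

response = vision.analyze_image(details)

# Each detected object carries a label, a confidence score, and a bounding polygon,
# which is what an app would use to draw the highlight rectangles over the image.
for obj in response.data.image_objects or []:
    print(obj.name, round(obj.confidence, 3))
```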
But the possibilities are endless and you can use OCI Vision for any of your image analysis. 03:29 Nikita: Let’s shift gears to Oracle Digital Assistant. Chaitanya, can you tell us what it’s all about? Chaitanya: Oracle Digital Assistant is a low-code conversational AI platform that allows businesses to build and deploy AI assistants. It provides natural language understanding, automatic speech recognition, and text-to-speech capabilities to enable human-like interactions with customers and employees. Oracle Digital Assistant comes with prebuilt templates for you to get started.  04:00 Lois: What are its key features and benefits, Chaitanya? How does it enhance the user experience? Chaitanya: Oracle Digital Assistant provides conversational AI capabilities that include generative AI features, natural language understanding and ML, AI-powered voice, and analytics and insights. Integration with enterprise applications become easier with unified conversational experience, prebuilt chatbots for Oracle Cloud applications, and chatbot architecture frameworks. Oracle Digital Assistant provides advanced conversational design tools, conversational designer, dialogue and domain trainer, and native multilingual support. Oracle Digital Assistant is open, scalable, and secure. It provides multi-channel support, automated bot-to-agent transfer, and integrated authentication profile. 04:56 Nikita: And what about the architecture? What happens at the back end? Chaitanya: Developers assemble digital assistants from one or more skills. Skills can be based on prebuilt skills provided by Oracle or third parties, custom developed, or based on one of the many skill templates available. 05:16 Lois: Chaitanya, what exactly are “skills” within the Oracle Digital Assistant framework?  Chaitanya: Skills are individual chatbots that are designed to interact with users and fulfill specific type of tasks. Each skill helps a user complete a task through a combination of text messages and simple UI elements like select list. When a user request is submitted through a channel, the Digital Assistant routes the user's request to the most appropriate skill to satisfy the user's request. Skills can combine multilingual NLP deep learning engine, a powerful dialogflow engine, and integration components to connect to back-end systems.  Skills provide a modular way to build your chatbot functionality. Now users connect with a chatbot through channels such as Facebook, Microsoft Teams, or in our case, Oracle APEX chatbot, which is embedded into an APEX application. 06:21 Nikita: That’s fascinating. So, what are some use cases of Oracle Digital Assistant in APEX apps? Chaitanya: Digital assistants streamline approval processes by collecting information, routing requests, and providing status updates. Digital assistants offer instant access to information and documentation, answering common questions and guiding users. Digital assistants assist sales teams by automating tasks, responding to inquiries, and guiding prospects through the sales funnel. Digital assistants facilitate procurement by managing orders, tracking deliveries, and handling supplier communication. Digital assistants simplify expense approvals by collecting reports, validating receipts, and routing them for managerial approval. Digital assistants manage inventory by tracking stock levels, reordering supplies, and providing real-time inventory updates. Digital assistants have become a common UX feature in any enterprise application. 
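The routing behavior Chaitanya describes, where the digital assistant hands each utterance to the most appropriate skill, can be pictured with a small toy example. This is purely conceptual Python, not Oracle Digital Assistant's actual API; a real assistant relies on its NLP engine and trained intents rather than keyword overlap.

```python
# Toy illustration of skill routing: score each registered skill against the
# user's utterance and hand the message to the best match.
SKILLS = {
    "expense_approvals": {"expense", "receipt", "reimburse", "approval"},
    "order_tracking":    {"order", "delivery", "shipment", "track"},
    "it_helpdesk":       {"password", "laptop", "vpn", "reset"},
}


def route_to_skill(utterance: str) -> str:
    words = set(utterance.lower().split())
    scores = {skill: len(words & keywords) for skill, keywords in SKILLS.items()}
    best_skill, best_score = max(scores.items(), key=lambda kv: kv[1])
    # A real digital assistant would fall back to a help/unresolved intent here.
    return best_skill if best_score > 0 else "unresolved_intent"


print(route_to_skill("I need to reset my vpn password"))   # -> it_helpdesk
print(route_to_skill("Where is my order delivery?"))       # -> order_tracking
```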
07:28 Want to learn how to design stunning, responsive enterprise applications directly from your browser with minimal coding? The new Oracle APEX Developer Professional learning path and certification enables you to leverage AI-assisted development, including generative AI and Database 23ai, to build secure, scalable web and mobile applications with advanced AI-powered features. From now through May 15, 2025, we’re waiving the certification exam fee (valued at $245). So, what are you waiting for? Visit mylearn.oracle.com to get started today. 08:09 Nikita: Welcome back! Thanks for that, Chaitanya. Toufiq, let’s talk about the OCI Document Understanding service. What is it? Toufiq: Using this service, you can upload documents to extract text, tables, and other key data. This means the service can automatically identify and extract relevant information from various types of documents, such as invoices, receipts, contracts, etc. The service is serverless and multitenant, which means you don't need to manage any servers or infrastructure. You can access this service using the console, REST APIs, SDK, or CLI, giving you multiple ways to integrate. 08:55 Nikita: What do we use for APEX apps?  Toufiq: For APEX applications, we will be using REST APIs to integrate the service. Additionally, you can process individual files or batches of documents using the ProcessorJob API endpoint. This flexibility allows you to handle different volumes of documents efficiently, whether you need to process a single document or thousands at once. With these capabilities, the OCI Document Understanding service can significantly streamline your document processing tasks, saving time and reducing the potential for manual errors. 09:36 Lois: Ok.  What are the different types of models available? How do they cater to various business needs? Toufiq: Let us start with pre-trained models. These are ready-to-use models that come right out of the box, offering a range of functionalities. The available models are Optical Character Recognition (OCR) enables the service to extract text from documents, allowing you to digitize, scan the documents effortlessly. You can precisely extract text content from documents. Key-value extraction, useful in streamlining tasks like invoice processing. Table extraction can intelligently extract tabular data from documents. Document classification automatically categorizes documents based on their content. OCR PDF enables seamless extraction of text from PDF files. Now, what if your business needs go beyond these pre-trained models. That's where custom models come into play. You have the flexibility to train and build your own models on top of these foundational pre-trained models. Models available for training are key value extraction and document classification. 10:50 Nikita: What does the architecture look like for OCI Document Understanding? Toufiq: You can ingest or supply the input file in two different ways. You can upload the file to an OCI Object Storage location. And in your request, you can point the Document Understanding service to pick the file from this Object Storage location.  Alternatively, you can upload a file directly from your computer. Once the file is uploaded, the Document Understanding service can process the file and extract key information using the pre-trained models. You can also customize models to tailor the extraction to your data or use case. 
After processing the file, the Document Understanding service stores the results in JSON format in the Object Storage output bucket. Your Oracle APEX application can then read the JSON file from the Object Storage output location, parse the JSON, and store useful information at local table or display it on the screen to the end user. 11:52 Lois: And what about use cases? How are various industries using this service? Toufiq: In financial services, you can utilize Document Understanding to extract data from financial statements, classify and categorize transactions, identify and extract payment details, streamline tax document management. Under manufacturing, you can perform text extraction from shipping labels and bill of lading documents, extract data from production reports, identify and extract vendor details. In the healthcare industry, you can automatically process medical claims, extract patient information from forms, classify and categorize medical records, identify and extract diagnostic codes. This is not an exhaustive list, but provides insights into some industry-specific use cases for Document Understanding. 12:50 Nikita: Toufiq, let’s switch to the big topic everyone’s excited about—the OCI Generative AI Service. What exactly is it? Toufiq: OCI Generative AI is a fully managed service that provides a set of state of the art, customizable large language models that cover a wide range of use cases. It provides enterprise grade generative AI with data governance and security, which means only you have access to your data and custom-trained models. OCI Generative AI provides pre-trained out-of-the-box LLMs for text generation, summarization, and text embedding. OCI Generative AI also provides necessary tools and infrastructure to define models with your own business knowledge. 13:37 Lois: Generally speaking, how is OCI Generative AI useful?  Toufiq: It supports various large language models. New models available from Meta and Cohere include Llama2 developed by Meta, and Cohere's Command model, their flagship text generation model. Additionally, Cohere offers the Summarize model, which provides high-quality summaries, accurately capturing essential information from documents, and the Embed model, converting text to vector embeddings representation. OCI Generative AI also offers dedicated AI clusters, enabling you to host foundational models on private GPUs. It integrates LangChain and open-source framework for developing new interfaces for generative AI applications powered by language models. Moreover, OCI Generative AI facilitates generative AI operations, providing content moderation controls, zero downtime endpoint model swaps, and endpoint deactivation and activation capabilities. For each model endpoint, OCI Generative AI captures a series of analytics, including call statistics, tokens processed, and error counts. 14:58 Nikita: What about the architecture? How does it handle user input? Toufiq: Users can input natural language, input/output examples, and instructions. The LLM analyzes the text and can generate, summarize, transform, extract information, or classify text according to the user's request. The response is sent back to the user in the specified format, which can include raw text or formatting like bullets and numbering, etc. 15:30 Lois: Can you share some practical use cases for generative AI in APEX apps?  Toufiq: Some of the OCI generative AI use cases for your Oracle APEX apps include text summarization. 
Generative AI can quickly summarize lengthy documents such as articles, transcripts, doctor's notes, and internal documents. Businesses can utilize generative AI to draft marketing copy, emails, blog posts, and product descriptions efficiently. Generative AI-powered chatbots are capable of brainstorming, problem solving, and answering questions. With generative AI, content can be rewritten in different styles or languages. This is particularly useful for localization efforts and catering to a diverse audience. Generative AI can classify intent in customer chat logs, support tickets, and more. This helps businesses understand customer needs better and provide tailored responses and solutions. By searching call transcripts and internal knowledge sources, generative AI enables businesses to efficiently answer user queries. This enhances information retrieval and decision-making processes. 16:47 Lois: Before we let you go, can you explain what Select AI is? How is it different from the other AI services? Toufiq: Select AI is a feature of Autonomous Database. This is where Select AI differs from the other AI services. Be it OCI Vision, Document Understanding, or OCI Generative AI, these are all fully managed standalone services on Oracle Cloud, accessible via REST APIs. Whereas Select AI is a feature available in Autonomous Database. That means to use Select AI, you need Autonomous Database.  17:26 Nikita: And what can developers do with Select AI? Toufiq: Traditionally, SQL is the language used to query the data in the database. With Select AI, you can talk to the database and get insights from the data in the database using human language. At the most basic level, what Select AI does is generate SQL queries using natural language, like an NL2SQL capability.  17:52 Nikita: How does it actually do that? Toufiq: When a user asks a question, the first step Select AI does is look into the AI profile, which you, as a developer, define. The AI profile holds crucial information, such as table names, the LLM provider, and the credentials needed to authenticate with the LLM service. Next, Select AI constructs a prompt. This prompt includes information from the AI profile and the user's question. Essentially, it's a packet of information containing everything the LLM service needs to generate SQL. The next step is generating SQL using the LLM. The prompt prepared by Select AI is sent to the available LLM services via REST. Which LLM to use is configured in the AI profile. The supported providers are OpenAI, Cohere, Azure OpenAI, and OCI Generative AI. Once the SQL is generated by the LLM service, it is returned to the application. The app can then handle the SQL query in various ways, such as displaying the SQL results in a report format or as charts, etc.  19:05 Lois: This has been an incredible discussion! Thank you, Chaitanya, Apoorva, and Toufiq, for walking us through all of these amazing AI tools. If you’re ready to dive deeper, visit mylearn.oracle.com and search for the Oracle APEX: Empowering Low Code Apps with AI course. You’ll find step-by-step guides and demos for everything we covered today.  Nikita: Until next week, this is Nikita Abraham… Lois: And Lois Houston signing off! 19:31 That’s all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We’d also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
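Picking up Toufiq's point that Document Understanding writes its results as JSON to an Object Storage output bucket, here is a hedged sketch of reading and parsing that output with the OCI Python SDK. The namespace, bucket, and object names are placeholders, and the JSON field names shown are illustrative only; check the service's output schema for the exact structure.

```python
import json

import oci

config = oci.config.from_file()                       # standard ~/.oci/config
object_storage = oci.object_storage.ObjectStorageClient(config)

# Placeholders: the output location configured for the processor job.
namespace, bucket = "mynamespace", "doc-understanding-output"
object_name = "results/invoice-123.json"

# Download the result file the service produced after processing the document.
body = object_storage.get_object(namespace, bucket, object_name).data.content
result = json.loads(body)

# Field names below are illustrative; adjust them to the actual output schema.
for page in result.get("pages", []):
    for field in page.get("documentFields", []):
        label = field.get("fieldLabel", {}).get("name")
        value = field.get("fieldValue", {}).get("value")
        print(label, "=>", value)   # e.g. what an APEX app would store in a local table
```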
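And for the Select AI flow Toufiq walks through, a minimal sketch of asking Autonomous Database to show (rather than run) the generated SQL through the DBMS_CLOUD_AI package, using the python-oracledb driver. The connection details and the profile name MY_AI_PROFILE are assumptions; the AI profile must already exist and reference your LLM provider credentials.

```python
import oracledb

# Placeholder connection details for an Autonomous Database instance.
conn = oracledb.connect(user="app_user", password="***", dsn="mydb_high")

question = "show me the average salary of employees in each department"

with conn.cursor() as cur:
    # action => 'showsql' returns the generated SQL instead of executing it.
    cur.execute(
        """
        SELECT DBMS_CLOUD_AI.GENERATE(
                 prompt       => :question,
                 profile_name => 'MY_AI_PROFILE',
                 action       => 'showsql')
        FROM dual
        """,
        question=question,
    )
    generated_sql = cur.fetchone()[0].read()   # CLOB result -> string
    print(generated_sql)
```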

22 Apr 20min

AI-Assisted Development in Oracle APEX

AI-Assisted Development in Oracle APEX

Get ready to explore how generative AI is transforming development in Oracle APEX. In this episode, hosts Lois Houston and Nikita Abraham are joined by Oracle APEX experts Apoorva Srinivas and Toufiq Mohammed to break down the innovative features of APEX 24.1. Learn how developers can use APEX Assistant to build apps, generate SQL, and create data models using natural language prompts.   Oracle APEX: Empowering Low Code Apps with AI: https://mylearn.oracle.com/ou/course/oracle-apex-empowering-low-code-apps-with-ai/146047/ Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode.   --------------------------------------------------------------   Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we’ll bring you foundational training on the most popular Oracle technologies. Let’s get started! 00:25 Nikita: Welcome back to another episode of the Oracle University Podcast! I’m Nikita Abraham, Team Lead of Editorial Services with Oracle University, and I’m joined by Lois Houston, Director of Innovation Programs. Lois: Hi everyone! In our last episode, we spoke about Oracle APEX and AI. We covered the data and AI -centric challenges businesses are up against and explored how AI fits in with Oracle APEX. Niki, what’s in store for today? Nikita: Well, Lois, today we’re diving into how generative AI powers Oracle APEX. With APEX 24.1, developers can use the Create Application Wizard to tell APEX what kind of application they want to build based on available tables. Plus, APEX Assistant helps create, refine, and debug SQL code in natural language. 01:16 Lois: Right. Today’s episode will focus on how generative AI enhances development in APEX. We’ll explore its architecture, the different AI providers, and key use cases. Joining us are two senior product managers from Oracle—Apoorva Srinivas and Toufiq Mohammed. Thank you both for joining us today. We’ll start with you, Apoorva. Can you tell us a bit about the generative AI service in Oracle APEX? Apoorva: It is nothing but an abstraction to the popular commercial Generative AI products, like OCI Generative AI, OpenAI, and Cohere. APEX makes use of the existing REST infrastructure to authenticate using the web credentials with Generative AI Services. Once you configure the Generative AI Service, it can be used by the App Builder, AI Assistant, and AI Dynamic Actions, like Show AI Assistant and Generate Text with AI, and also the APEX_AI PL/SQL API. You can enable or disable the Generative AI Service on the APEX instance level and on the workspace level. 02:31 Nikita: Ok. Got it. So, Apoorva, which AI providers can be configured in the APEX Gen AI service? Apoorva: First is the popular OpenAI. If you have registered and subscribed for an OpenAI API key, you can just enter the API key in your APEX workspace to configure the Generative AI service. APEX makes use of the chat completions endpoint in OpenAI. Second is the OCI Generative AI Service. Once you have configured an OCI API key on Oracle Cloud, you can make use of the chat models. The chat models are available from Cohere family and Meta Llama family.  The third is the Cohere. The configuration of Cohere is similar to OpenAI. 
You need to have your Cohere API key. And it provides a similar chat functionality using the chat endpoint.  03:29 Lois: What is the purpose of the APEX_AI PL/SQL public API that we now have? How is it used within the APEX ecosystem? Apoorva: It models the chat operation of the popular Generative AI REST Services. This is the same package used internally by the chat widget of the APEX Assistant. There are more procedures around consent management, which you can configure using this package. 03:58 Lois: Apoorva, at a high level, how does generative AI fit into the APEX environment? Apoorva: APEX makes use of the existing REST infrastructure—that is the web credentials and remote server—to configure the Generative AI Service. The inferencing is done by the backend Generative AI Service. For the Generative AI use cases in APEX, such as NL2SQL and creation of an app, APEX performs the prompt enrichment.  04:29 Nikita: And what exactly is prompt enrichment? Apoorva: Let's say you provide a prompt saying "show me the average salary of employees in each department." APEX will take this prompt and enrich it by adding in more details. It elaborates on the prompt by mentioning the requirements, such as using Oracle SQL syntax, and providing some metadata from the data dictionary of APEX. Once the prompt enrichment is complete, it is then passed on to the LLM inferencing service. Therefore, the SQL query provided by the AI Assistant is more accurate and in context.  05:15 Unlock the power of AI Vector Search with our new course and certification. Get more accurate search results, handle complex datasets easily, and supercharge your data-driven decisions. From now to May 15, 2025, we are waiving the certification exam fee (valued at $245). Visit mylearn.oracle.com to enroll. 05:41  Nikita: Welcome back! Let’s talk use cases. Apoorva, can you share some ways developers can use generative AI with APEX? Apoorva: SQL is an integral part of building APEX apps. You use SQL everywhere. You can make use of the NL2SQL feature in the code editor by using the APEX Assistant to generate SQL queries while building the apps. The second is the prompt-based app creation. With APEX Assistant, you can now generate fully functional APEX apps by providing prompts in natural language. Third is the AI Assistant, which is a chat widget provided by APEX in all the code editors and for creation of apps. You can chat with the AI Assistant by providing your prompts and get responses from the Generative AI Services. 06:37 Lois: Without getting too technical, can you tell us how to create a data model using AI? Apoorva: A SQL Workshop utility called Create Data Model Using AI uses AI to help you create your own data model. The APEX Assistant generates a script to create tables, triggers, and constraints in either Oracle SQL or Quick SQL format. You can also insert sample data into these tables. But before you use this feature, you must create a generative AI service and enable the Used by App Builder setting. If you are using the Oracle SQL format, when you click on Create SQL Script, APEX generates the script and brings you to the script editor page. Whereas if you are using the Quick SQL format, when you click on Review Quick SQL, APEX generates the Quick SQL code and brings you to the Quick SQL page. 07:39 Lois: And to see a detailed demo of creating a custom data model with the APEX Assistant, visit mylearn.oracle.com and search for the "Oracle APEX: Empowering Low Code Apps with AI" course. 
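The prompt enrichment step Apoorva describes earlier in this episode can be pictured with a small sketch: take the user's question, prepend the requirements and some data dictionary metadata, and pass the combined text to the LLM. This is conceptual Python, not APEX's internal implementation; the table metadata and the resulting prompt text are hypothetical.

```python
def enrich_prompt(user_prompt: str, table_metadata: dict[str, list[str]]) -> str:
    """Combine the user's question with requirements and schema metadata,
    roughly the way an NL2SQL assistant prepares its request."""
    schema_lines = [
        f"TABLE {table} ({', '.join(columns)})"
        for table, columns in table_metadata.items()
    ]
    return "\n".join([
        "You generate a single query using Oracle SQL syntax.",
        "Only use the tables and columns listed below.",
        *schema_lines,
        f"User request: {user_prompt}",
    ])


# Hypothetical metadata pulled from the data dictionary of the workspace schema.
metadata = {
    "EMPLOYEES": ["EMPLOYEE_ID", "NAME", "SALARY", "DEPARTMENT_ID"],
    "DEPARTMENTS": ["DEPARTMENT_ID", "DEPARTMENT_NAME"],
}

prompt = enrich_prompt("show me the average salary of employees in each department", metadata)
print(prompt)   # this enriched text is what would be sent to the LLM inferencing service
```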
Apoorva, what about creating an APEX app from a prompt. What’s that process like? Apoorva: APEX 24.1 introduces a new feature where you can generate an application blueprint based on a prompt using natural language. The APEX Assistant leverages the APEX Dictionary Cache to identify relevant tables while suggesting the pages to be created for your application. You can iterate over the application design by providing further prompts using natural language and then generating an application based on your needs. Once you are satisfied, you can click on Create Application, which takes you to the Create Application Wizard in APEX, where you can further customize your application, such as application icon and other features, and finally, go ahead to create your application. 08:53 Nikita: Again, you can watch a demo of this on MyLearn. So, check that out if you want to dive deeper.  Lois: That’s right, Niki. Thank you for these great insights, Apoorva! Now, let's turn to Toufiq. Toufiq, can you tell us more about the APEX Assistant feature in Oracle APEX. What is it and how does it work? Toufiq: APEX Assistant is available in Code Editors in the APEX App Builder. It leverages generative AI services as the backend to answer your questions asked in natural language. APEX Assistant makes use of the APEX dictionary cache to identify relevant tables while generating SQL queries. Using the Query Builder mode enables Assistant. You can generate SQL queries from natural language for Form, Report, and other region types which support SQL queries. Using the general assistance mode, you can generate PL/SQL JavaScript, HTML, or CSS Code, and seek further assistance from generative AI. For example, you can ask the APEX Assistant to optimize the code, format the code for better readability, add comments, etc. APEX Assistant also comes with two quick actions, Improve and Explain, which can help users improve and understand the selected code. 10:17 Nikita: What about the Show AI Assistant dynamic action? I know that it provides an AI chat interface, but can you tell us a little more about it?  Toufiq: It is a native dynamic action in Oracle APEX which renders an AI chat user interface. It leverages the generative AI services that are configured under Workspace utilities. This AI chat user interface can be rendered inline or as a dialog. This dynamic action also has configurable system prompt and welcome message attributes.  10:52 Lois: Are there attributes you can configure to leverage even more customization?  Toufiq: The first attribute is the initial prompt. The initial prompt represents a message as if it were coming from the user. This can either be a specific item value or a value derived from a JavaScript expression. The next attribute is use response. This attribute determines how the AI Assistant should return responses. The term response refers to the message content of an individual chat message. You have the option to capture this response directly into a page item, or to process it based on more complex logic using JavaScript code. The final attribute is quick actions. A quick action is a predefined phrase that, once clicked, will be sent as a user message. Quick actions defined here show up as chips in the AI chat interface, which a user can click to send the message to Generative AI service without having to manually type in the message. 12:05 Lois: Thank you, Toufiq and Apoorva, for joining us today. 
Like we were saying, there’s a lot more you can find in the “Oracle APEX: Empowering Low Code Apps with AI” course on MyLearn. So, make sure you go check that out. Nikita: Join us next week for a discussion on how to integrate APEX with OCI AI Services. Until then, this is Nikita Abraham… Lois: And Lois Houston signing off! 12:28 That’s all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We’d also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
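As a small illustration of the Use Response idea Toufiq mentions in this episode, taking the message content returned by the assistant and processing it with your own logic instead of dropping it straight into a page item, here is a hedged Python sketch that pulls a SQL statement out of a typical fenced chat response. The response text is invented for the example.

```python
import re
from typing import Optional

FENCE = chr(96) * 3   # three backticks, built this way only to keep this example readable


def extract_sql(chat_response: str) -> Optional[str]:
    """Pull the first SQL statement out of a chat-style response.

    Assistants often wrap generated SQL in a fenced code block; return None
    when no block is found so the caller can decide what to do instead.
    """
    pattern = FENCE + r"(?:sql)?\s*(.*?)" + FENCE
    match = re.search(pattern, chat_response, flags=re.DOTALL | re.IGNORECASE)
    return match.group(1).strip() if match else None


# Invented example of what a generative AI service might return.
response_text = (
    "Here is a query you can use:\n"
    + FENCE + "sql\n"
    + "SELECT department_id, AVG(salary) FROM employees GROUP BY department_id\n"
    + FENCE
)

print(extract_sql(response_text))
```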

15 Apr 12min

Unlocking the Power of Oracle APEX and AI

Unlocking the Power of Oracle APEX and AI

Lois Houston and Nikita Abraham kick off a new season of the podcast, exploring how Oracle APEX integrates with AI to build smarter low-code applications. They are joined by Chaitanya Koratamaddi, Director of Product Management at Oracle, who explains the basics of Oracle APEX, its global adoption, and the challenges it addresses for businesses managing and integrating data. They also explore real-world use cases of AI within the Oracle APEX ecosystem   Oracle APEX: Empowering Low Code Apps with AI: https://mylearn.oracle.com/ou/course/oracle-apex-empowering-low-code-apps-with-ai/146047/ Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode.   -----------------------------------------------------------------   Episode Transcript:   00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we’ll bring you foundational training on the most popular Oracle technologies. Let’s get started! 00:25 Lois: Hello and welcome to the Oracle University Podcast! I’m Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Team Lead: Editorial Services. Nikita: Hi everyone! Thank you for joining us as we begin a new season of the podcast, this time focused on Oracle APEX and how it integrates with AI to help you create powerful applications. This season is for everyone—from beginners and SQL developers to DBA data scientists and low-code enthusiasts. So, if you’re interested in using Oracle APEX to build low-code applications that have custom generative AI features, you’ll want to stay tuned in. 01:07 Lois: That’s right, Niki. Today, we're going to discuss Oracle APEX at a high level, starting with what it is. Then, we’ll cover a few business challenges related to data and AI innovation that organizations face, and learn how the powerful combination of APEX and AI can help overcome these challenges.  01:27 Nikita: To take us through it all, we’ve got Chaitanya Koratamaddi with us. Chaitanya is Director of Product Management for Oracle APEX. Hi Chaitanya! For anyone new to Oracle APEX, can you explain what it is and why it's so widely used? Chaitanya: Oracle APEX is the world's most popular enterprise low code application platform. APEX enables you to build secure and scalable enterprise-scale applications with world class features that can be deployed anywhere, cloud or on-premises. And with APEX, you can build applications 20 times faster with 100 times less code. APEX delivers the most productive way to develop and deploy mobile and web applications everywhere. 02:18 Lois: That’s impressive. So, what’s the adoption rate like for Oracle APEX? Chaitanya: As of today, there are 19 million plus APEX applications created globally. 5,000 plus APEX applications are created on a daily basis and there are 800,000 plus APEX developers worldwide. 60,000 plus customers in 150 countries across various industry verticals. And 75% of Fortune 500 companies use Oracle APEX. 02:56 Nikita: Wow, the numbers really speak for themselves, right? But Chaitanya, why are organizations adopting Oracle APEX at this scale? Or to put it differently, what’s the core business challenge that Oracle APEX is addressing? 
Chaitanya: From databases to all data, you know that the world is more connected and automated than ever. To drive new business value, organizations need to explore and exploit new sources of data that are generated from this connected world. That can be sounds, feeds, sensors, videos, images, and more. Businesses need to be able to work with all types of data and also make sure that it is available to be used together. Typically, businesses need to work on all data at a massive scale. For example, supply chains are no longer dependent just on inventory, demand, and order management signals. A manufacturer should be able to understand data describing global weather patterns and how it impacts their supply chains. Businesses need to pull in data from as many social sources as possible to understand how customer sentiment impacts product sales and corporate brands. Our customers need a data platform that ensures all this data works together seamlessly and easily. 04:38 Lois: So, you’re saying Oracle APEX is the platform that helps businesses manage and integrate data seamlessly. But data is just one part of the equation, right? Then there’s AI. How are the two related?  Chaitanya: Before we start talking about Oracle AI, let's first talk about what customers are looking for and where they are struggling within their AI innovation. It all starts with data. For decades, working with data has largely involved dealing with structured data, whether it is your customer records in your CRM application and orders from your ERP database. Data was organized into database and tables, and when you needed to find some insights in your data, all you need to do is just use stored procedures and SQL queries to deliver the answers. But today, the expectations are higher. You want to use AI to construct sophisticated predictions, find anomalies, make decisions, and even take actions autonomously. And the data is far more complicated. It is in an endless variety of formats scattered all over your business. You need tools to find this data, consume it, and easily make sense of it all. And now capabilities like natural language processing, computer vision, and anomaly detection are becoming very essential just like how SQL queries used to be. You need to use AI to analyze phone call transcripts, support tickets, or email complaints so you can understand what customers need and how they feel about your products, customer service, and brand. You may want to use a data source as noisy and unstructured as social media data to detect trends and identify issues in real time.  Today, AI capabilities are very essential to accelerate innovation, assess what's happening in your business, and most importantly, exceed the expectations of your customers. So, connecting your application, data, and infrastructure allows everyone in your business to benefit from data. 07:32 Raise your game with the Oracle Cloud Applications skills challenge. Get free training on Oracle Fusion Cloud Applications, Oracle Modern Best Practice, and Oracle Cloud Success Navigator. Pass the free Oracle Fusion Cloud Foundations Associate exam to earn a Foundations Associate certification. Plus, there’s a chance to win awards and prizes throughout the challenge! What are you waiting for? Join the challenge today by visiting oracle.com/education. 08:06 Nikita: Welcome back! So, let’s focus on AI across the Oracle Cloud ecosystem. How does Oracle bring AI into the mix to connect applications, data, and infrastructure for businesses? 
Chaitanya: By embedding AI throughout the entire technology stack from the infrastructure that businesses run on through the applications for every line of business, from finance to supply chain and HR, Oracle is helping organizations pragmatically use AI to improve performance while saving time, energy, and resources.  Our core cloud infrastructure includes a unique AI infrastructure layer based on our supercluster technology, leveraging the latest and greatest hardware and uniquely able to get the maximum out of the AI infrastructure technology for scenarios such as large language processing. Then there is generative AI and ML for data platforms. On top of the AI infrastructure, our database layer embeds AI in our products such as autonomous database. With autonomous database, you can leverage large language models to use natural language queries rather than writing a SQL when interacting with the autonomous database. This enables you to achieve faster adoption in your application development. Businesses and their customers can use the Select AI natural language interface combined with Oracle Database AI Vector Search to obtain quicker, more intuitive insights into their own data. Then we have AI services. AI services are a collection of offerings, including generative AI with pre-built machine learning models that make it easier for developers to apply AI to applications and business operations. The models can be custom-trained for more accurate business results. 10:17 Nikita: And what specific AI services do we have at Oracle, Chaitanya?  Chaitanya: We have Oracle Digital Assistant Speech, Language, Vision, and Document Understanding. Then we have Oracle AI for Applications. Oracle delivers AI built for business, helping you make better decisions faster and empowering your workforce to work more effectively. By embedding classic and generative AI into its applications, Fusion Apps customers can instantly access AI outcomes wherever they are needed without leaving the software environment they use every day to power their business. 11:02 Lois: Let’s talk specifically about APEX. How does APEX use the Gen AI and machine learning models in the stack to empower developers. How does it help them boost productivity? Chaitanya: Starting APEX 24.1, you can choose your preferred large language models and leverage native generative AI capabilities of APEX for AI assistants, prompt-based application creation, and more. Using native OCI capabilities, you can leverage native platform capabilities from OCI, like AI infrastructure and object storage, etc. Oracle APEX running on autonomous infrastructure in Oracle Cloud leverages its unique native generative AI capabilities tuned specifically on your data. These language models are schema aware, data aware, and take into account the shape of information, enabling your applications to take advantage of large language models pre-trained on your unique data. You can give your users greater insights by leveraging native capabilities, including vector-based similarity search, content summary, and predictions. You can also incorporate powerful AI features to deliver personalized experiences and recommendations, process natural language prompts, and more by integrating directly with a suite of OCI AI services. 12:38 Nikita: Can you give us some examples of this? Chaitanya: You can leverage OCI Vision to interpret visual and text inputs, including image recognition and classification. 
Or you can use OCI Speech to transcribe and understand spoken language, making both image and audio content accessible and actionable. You can work with disparate data sources like JSON, spatial, graphs, vectors, and build AI capabilities around your own business data. So, low-code application development with APEX along with AI is a very powerful combination. 13:22 Nikita: What are some use cases of AI-powered Oracle APEX applications?  Chaitanya: You can build APEX applications to include conversational chatbots. Your APEX applications can include image and object detection capability. Your APEX applications can include speech transcription capability. And in your applications, you can include code generation that is natural language to SQL conversion capability. Your applications can be powered by semantic search capability. Your APEX applications can include text generation capability. 14:00 Lois: So, there’s really a lot we can do! Thank you, Chaitanya, for joining us today. With that, we’re wrapping up this episode. We covered Oracle APEX, the key challenges businesses face when it comes to AI innovation, and how APEX and AI work together to give businesses an AI edge.  Nikita: Yeah, and if you want to know more about Oracle APEX, visit mylearn.oracle.com and search for the Oracle APEX: Empowering Low Code Apps with AI course. Join us next week for a discussion on AI-assisted development in Oracle APEX. Until then, this is Nikita Abraham… Lois: And Lois Houston signing off! 14:39 That’s all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We’d also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
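For the vector-based similarity search capability Chaitanya mentions, here is a hedged sketch of what such a query can look like against Oracle Database 23ai from Python. The connection details, the DOCS table, its columns, and the query vector are all hypothetical, and the example assumes DOCS.EMBEDDING already stores embeddings in a VECTOR column.

```python
import json

import oracledb

conn = oracledb.connect(user="app_user", password="***", dsn="mydb_high")  # placeholders

# A hypothetical query embedding produced by whatever model populated DOCS.EMBEDDING.
query_vector = [0.12, -0.48, 0.33, 0.91]

sql = """
    SELECT title
    FROM   docs
    ORDER  BY VECTOR_DISTANCE(embedding, TO_VECTOR(:qv), COSINE)
    FETCH FIRST 5 ROWS ONLY
"""

with conn.cursor() as cur:
    # Pass the vector in its textual form; TO_VECTOR converts it server-side.
    cur.execute(sql, qv=json.dumps(query_vector))
    for (title,) in cur:
        print(title)
```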

8 Apr 15min

Raise Your Game with Oracle Cloud Applications

Raise Your Game with Oracle Cloud Applications

In this special episode of the Oracle University Podcast, Bill Lawson and Nikita Abraham chat with Peter Fernandez, Senior Director of Cloud Certification at Oracle University, about the exciting new Raise Your Game challenge. They discuss how the initiative is designed to enhance participants' skills in Oracle Fusion Cloud Applications and Oracle Cloud Success Navigator. They also cover key details about the challenge, such as how to get started, who can participate, the way it is structured, and the prizes up for grabs.   Raise Your Game: https://education.oracle.com/raise-your-game-saas Oracle University Learning Community:  https://education.oracle.com/ou-community LinkedIn:  https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, and the OU Studio Team for helping us create this episode.   ------------------------------------------------------------------   Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we’ll bring you foundational training on the most popular Oracle technologies. Let’s get started! 00:25 Bill: Welcome to the Oracle University Podcast. I’m Bill Lawson, Senior Director of Cloud Applications Product Management with Oracle University, and with me is Nikita Abraham, Team Lead of Editorial Services. Nikita: Hi everyone! Last week, we concluded our three-part series on multicloud, and today, we’re shifting gears and exploring an exciting new challenge that’s been thrown down by Oracle University. To tell us all about it, we have Peter Fernandez joining us. Peter is Senior Director of Cloud Certification at Oracle University. Hi Peter! We’re thrilled to have you with us today! Peter: Hi Niki, hi Bill! I’m delighted to be here. 01:02 Bill: So, Peter, let’s get straight into it. What’s this new challenge all about?  Peter: The challenge, which we’re calling Raise Your Game, is an incredible opportunity for anyone looking to gain knowledge and gain professional skills about Oracle’s Fusion Cloud Applications. We launched a skills challenge on Feb 14, and it will continue until May 15, 2025. This challenge encourages you to build expertise in two key areas: Oracle Fusion Cloud Applications and Oracle Cloud Success Navigator.  This training is geared towards anyone who could be a student in higher ed or someone pursuing a business degree, and Oracle customers and partners who are new to Oracle’s Applications or experienced consultants implementing business applications.  01:55 Nikita: And how exactly does the challenge help in building this expertise? Peter: The challenge has two levels. In Level 1, you’ll need to complete an Oracle Fusion Cloud Apps Foundations course and pass the corresponding exam. These courses are designed to deepen your understanding of the technology enablers in Oracle’s Fusion Cloud Applications and learn about Oracle’s Modern Best Practice, or OMBP. These are extremely helpful throughout all phases in the journey when implementing and using Oracle Fusion Cloud Applications. The Foundation training itself covers a wide range of topics, including core OMBP processes, key performance metrics, implementation considerations, and technology enablers like AI, ML, mobile, and analytics.  02:49 Bill: Before we move on, Peter, can you tell us more about Oracle Modern Best Practice? 
We discussed it a few weeks back, but for anyone who missed that episode, it’ll be nice to get a quick refresher. Peter: Sure, Bill. Implementing Oracle Fusion Cloud Applications successfully is more than just technology—it’s about following best practices that drive efficiency and success and tie back to business requirements.  Oracle Modern Best Practice represent years of accumulated experience, industry insights, and proven methodologies. It serves as a guiding framework for implementing efficient business processes within Oracle Fusion Cloud Applications. These best practices map the features and innovations within Oracle applications to the processes that customers perform every day, and that is key.  These curated, industry-leading practices detail how the features that we have built using the most modern technologies can be leveraged to optimize operations.  Having a solid grasp of an OMBP and its associated technology enablers will empower you to ensure smoother business operations and higher customer satisfaction. It will show you how to automate activities, streamline tasks, improve results, and set your team up for continued success.  The goal of these courses is to make it easy for implementers, global process owners, IT teams to identify every opportunity to improve an organization’s business processes with Oracle Fusion Cloud Applications.  04:33 Bill: So, getting back to Level 1, what do I earn when I complete it? Peter: When you complete this level, you’ll earn a Level 1 Oracle University Learning Community badge. This recognizes that you have foundational knowledge in your chosen Fusion application. 04:48 Bill: That sounds exciting. And then there’s a Level 2? Peter: There is also a Level 2 and where things get even more exciting. You’re going to take your knowledge to the next level by completing the Oracle Cloud Success Navigator Essentials course and passing the associated assessment.  This level in the challenge focuses on particularly applying the knowledge you gained in Level 1 where you’ll explore the Oracle Cloud Success Navigator's features and functionality, and get the skills you need to lead organizations through their Oracle Fusion Cloud Applications implementation journey.  05:25 Nikita: And when I complete Level 2, I earn another badge? Peter: That’s right, Niki. When you successfully complete Level 2, you’ll earn a special Level 2 Oracle University Learning Community badge.  The goal of Raise Your Game is to reach the Summit by completing both Level 1 and Level 2 challenges with the fastest time and the highest pass scores. Both these combined determine your position on the leaderboard and your position in the Top 500, which will be awarded separate prizes at the end of the challenge.  05:59 Nikita: So, when you’re done, you’ll have both theoretical and practical knowledge. And I understand that there are some fantastic prizes up for grabs? Peter: Absolutely, Niki.  This not only helps with both theoretical but also practical knowledge. Learners also have a chance to be featured on the leaderboard in the Oracle University Learning Community. The leaderboard showcases the people who have achieved Level 1 and Level 2 with the fastest times and the highest scores. 
Along with the badges I told you about, at the end of the promotion, the top 500 people who complete both Level 1 and Level 2 with the fastest time and highest pass scores will receive an Oracle-branded cap, an Oracle Success Navigator pin, and a special Oracle University Community Success Navigator digital badge.  06:52 Bill: So, Peter, who can participate in this challenge, and are there any prerequisites? Peter: The challenge is open to anyone interested in expanding their knowledge of Oracle Cloud Applications. And while there are no strict prerequisites, a basic understanding of business concepts and some familiarity with Oracle Cloud Applications is recommended.  This will ensure that you’re able to make the most of the learning materials and engage with the content effectively. You can always check the program overview on the website if you have more questions about this challenge. We’ve got an FAQ posted there that should answer most anything you are curious about. 07:31 Bill: That’s good to know, Peter. And the fact that I get started no matter my level of experience is great news, too. Peter: Absolutely, Bill. Even if you are a beginner fresh out of college, or a seasoned pro like I mentioned earlier who has been implementing Oracle Fusion Cloud Applications (or other applications) for years, I would recommend the challenge and training to you.  Basically, this training is for everyone. This program provides foundational knowledge to improve the implementation approach using Oracle Modern Best Practices. Even those individuals that are certified in Cloud Applications will benefit from learning how these modern best practices fit into their work.  08:12 Nikita: Ok, Peter, I’m ready to do it. How do I get started with the challenge? Peter: That’s great. The first step, of course, is to register. And you can do this by visiting oracle.com/education. That's Oracle's main site. oracle.com/education. And select the first tile that you'll see on the webpage, which is the Raise Your Game challenge. If you don't already have an Oracle MyLearn account, you'll need to create one and you'll be prompted to create one. This account gives you access to the Oracle MyLearning platform. Once you're registered, you'll have access to a curated list of learning paths and corresponding certifications. It's important that you review the official rules and promotion details before proceeding with the challenge.  09:12 Unlock the power of AI Vector Search with our new course and certification. Get more accurate search results, handle complex datasets easily, and supercharge your data-driven decisions. From now to May 15, 2025, we are waiving the certification exam fee (valued at $245). Visit mylearn.oracle.com to enroll. 09:40 Nikita: Welcome back! Ok, Peter, I’ve registered. What’s next? Peter: After you're done registering, you need to select the Cloud Application course that aligns with your interests and goals. We have courses on four different areas: Human Capital Management, Enterprise Resource Planning, Supply Chain Management, and Customer Experience. Once you complete the training and certification, you're done with Level 1.  For Level 2, we have the Oracle Cloud Success Navigator Essentials course and assessment that you will need to complete. You can check your status on the leaderboard in the Oracle University Learning Community and share your progress on social media. 
Like I was saying, the time taken to complete each of these levels and the higher scores earned determines the Top 500 winners.  10:36 Bill: And the best part of this challenge is that it's completely free, right? Peter: Absolutely. There is no cost associated with participating in the skills challenge. It is completely free for anyone, anywhere in the world to participate as long as they comply with the official rules of the promotion. You can take any or all of the Foundation Associate certification exams at no cost. With multiple free attempts, there is no time limit for completing the exams, but to be eligible for the prizes, you must complete the exams and assessments by May 15, 2025. That's midnight GMT. 11:16 Nikita: What if someone doesn't pass the certification exam on their first attempt? Peter: If someone does not pass the certification exam on their first attempt, we understand that not everyone does. We've made provisions for that. If you don't pass the foundations associate certification exam, you have the option to retake the exam many times over. 11:37 Nikita: Now, Peter, let's say someone has already registered for the Fusion Cloud Applications Foundations Associate certification exam before joining the skills challenge. Will their exam be considered for the prizes? Peter: Well, that's a great question, Niki. If someone has already registered for the exam before joining the challenge, their exam will be considered for the prizes as long as they first join the skills challenge. This ensures that everyone who engages with the challenge has a fair chance to win. 12:06 Bill: Does course content consumed before the start of the challenge count towards the awards and badges? Peter: Unfortunately no, Bill. Any content consumed or purchased before Feb 14, 2025, that's again 12 AM GMT, does not apply retroactively to awards or prizes in the Raise Your Game challenge. We want everyone to start on an equal footing here. 12:29 Nikita: What about certifications earned before the challenge began? Peter: Again, certifications earned before Feb 14, 2025, again 12 AM GMT, do not qualify for the promotion. That ensures again that the challenge is fair for all participants. 12:48 Nikita: Now, Peter, how many free exam attempts do participants get as part of the challenge? Peter: Since all the Oracle Fusion Cloud Applications Foundations Associate Certification exams are free, there is no limit to the number of attempts. Participants can take these exams as many times as they need to. 13:05 Bill: And, Peter, say I want to take more than one of the Foundations courses and exams. Can I do that? Peter: Absolutely. This is a great way for someone to learn about the different areas of business that they may be familiar with. As I mentioned earlier, the Oracle Fusion Cloud Applications Foundations training is a program to provide you with knowledge of OMBPs, Oracle Modern Best Practices, that is, and Fusion Cloud Applications. So, it's a great opportunity to cross-skill. You can earn all four certifications if you choose. 13:38 Bill: Peter, thank you so much for joining us today and telling us all about this challenge. It is a really fantastic opportunity for everyone, whether you’re new to Fusion Cloud Applications or an experienced implementation professional, to boost your Oracle Cloud Apps expertise. We’re really excited to try it out ourselves! Peter: A sincere thank you to you, Bill and Niki. It's been an absolute pleasure. I'd really encourage everyone to jump on this challenge. 
It's a great way to enhance your learning journey and have some fun along the way. Nikita: I couldn’t agree more! Thanks Peter. That’s a wrap on this episode. Join us next week for another episode of the Oracle University Podcast. Until then, this is Nikita Abraham… Bill: And Bill Lawson, signing off! 14:19 That’s all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We’d also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.

1 Apr 14min

Oracle Database@Azure

Oracle Database@Azure

The final episode of the multicloud series focuses on Oracle Database@Azure, a powerful cloud database solution. Hosts Lois Houston and Nikita Abraham, along with Senior Manager of CSS OU Cloud Delivery Samvit Mishra, discuss how this service allows customers to run Oracle databases within the Microsoft Azure data center, simplifying deployment and management. The discussion also highlights the benefits of native integration with Azure services, eliminating the need for complex networking setups.   Oracle Cloud Infrastructure Multicloud Architect Professional: https://mylearn.oracle.com/ou/course/oracle-cloud-infrastructure-multicloud-architect-professional-2025-/144474 Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, and the OU Studio Team for helping us create this episode.   ---------------------------------------------------------------   Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we’ll bring you foundational training on the most popular Oracle technologies. Let’s get started! 00:25 Lois: Hello and welcome to the Oracle University Podcast! I’m Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Team Lead: Editorial Services. Nikita: Hi everyone! For the last two weeks, we’ve been talking about different aspects of multicloud. In the final episode of this three-part series, Samvit Mishra, Senior Manager of CSS OU Cloud Delivery, joins us once again to tell us about the Oracle Database@Azure service. Hi Samvit! Thanks for being here today. Samvit: Hi Niki! Hi Lois! Happy to be back. 01:01 Lois: In our last episode, we spoke about the strategic partnership between Oracle and Microsoft, and specifically discussed the Oracle Interconnect for Azure.  Nikita: Yeah, and Oracle Database@Azure is yet another addition to this partnership. What can you tell us about this service, Samvit? Samvit: The Oracle Database@Azure service, which was made generally available in 2023, runs right inside the Microsoft Azure data center and uses Azure networking. The entire Oracle Cloud Database Service infrastructure resides in the Azure data center, while it is managed by an expert Oracle Cloud Infrastructure operations team.  It provides customers simple and secure access to Oracle Cloud database services within their chosen Azure deployment region, without getting into the complexity of managing networking between the cloud vendors. It is natively integrated with various Microsoft Azure services. This provides a seamless user experience when configuring and using the different Azure services with OCI Oracle database, since much of the complexity associated with the configuration is greatly simplified.  There is no need to set up a private interconnect between Microsoft Azure and OCI because the service itself resides within the Azure data center and uses the Azure network. This is very beneficial in terms of strategic deployment because customers can experience microseconds network latency between the endpoints, while receiving a high-performance database environment.  02:42 Nikita: How do I get started with the Oracle Database@Azure service? Samvit: You begin by purchasing the subscription from Oracle and setting up your billing account. 
03:15 Lois: So, the adoption is pretty easy, then. What about the responsibilities? Who is responsible for what? Samvit: The Oracle Cloud operations team is entirely responsible for managing the Exadata Database Infrastructure and the VM cluster resources that are provisioned in the Microsoft Azure data center. Oracle is responsible for maintaining the service software and infrastructure by applying updates as they are released. Any issues arising from the OCI Database Service and its resources will be addressed by Oracle Support; you have to raise a support ticket for them to investigate and provide a resolution. And as Azure customers, you have to do the rightsizing based on your workload needs and provision the Exadata Database Infrastructure and VM cluster in the OCI pod within the Azure data center. You have to provision the database in Exadata Database Service, apply the database and system updates, and take advantage of the cloud automation to maintain and manage the database. You have to load data, establish the connectivity, and support development on your database. As a customer, you monitor the database and infrastructure metrics and events, and also analyze the logs using the Microsoft Azure-provided native services.
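To make that monitoring responsibility concrete, below is a minimal sketch that queries a Log Analytics workspace from Python using the azure-identity and azure-monitor-query packages. The workspace ID and the KQL table are placeholders; the actual tables and columns depend on which diagnostic logs you route into the workspace.

from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

credential = DefaultAzureCredential()
logs_client = LogsQueryClient(credential)

workspace_id = "<log-analytics-workspace-id>"  # placeholder

# Hypothetical query: pull the last hour of diagnostic records routed to the workspace.
kql = "AzureDiagnostics | where TimeGenerated > ago(1h) | take 20"

response = logs_client.query_workspace(
    workspace_id=workspace_id,
    query=kql,
    timespan=timedelta(hours=1),
)

# Assumes the query succeeds and returns full results.
for table in response.tables:
    for row in table.rows:
        print(row)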
04:42 Nikita: Samvit, what sort of challenges were being faced by customers that necessitated the creation of the Oracle Database@Azure service? Samvit: A common deployment scenario in customer environments was that a lot of critical applications, which could be packaged applications, in-house applications, or customized third-party applications, used Oracle Database as their primary database solution. These Oracle databases were deployed on Exadata infrastructure on-premises or even on enterprise server hardware. Some customers evaluated and migrated many of their packaged and other applications to Microsoft Azure compute. Since Oracle Exadata was not supported in Azure, they had to configure a hybrid deployment in order to use the Oracle databases residing on the Exadata infrastructure on-premises. They needed to configure a dedicated and secure network between the Azure data center and their on-premises data center. This added complexity, incurred high costs, introduced latency, and was even unreliable. There were also cases where customers migrated Oracle databases from enterprise servers on-premises to Oracle databases hosted on Azure compute. This did not boost efficiency to any great extent. And those were the only options available when provisioning Oracle Database in Azure, because Exadata was not available in Azure earlier. 06:18 Lois: And how has that been resolved now? Samvit: With the Oracle Database@Azure service, customer requirements have been aptly met by allowing them to host their Oracle databases on Exadata infrastructure, right next to their applications in the Azure data center. Customers, while migrating their applications to Azure compute, can also migrate their on-premises Oracle databases on Exadata infrastructure directly to Exadata Database Service in Azure. And Oracle databases that are on enterprise servers on-premises can be consolidated directly into Exadata Database Service in Azure, giving them the benefits of scalability, security, performance, and availability, all of which are inherent properties of the OCI Oracle Exadata Database Service. Customers see gains in operational efficiency while saving on overall cost. 07:17 Nikita: Can you take us through the process of deployment? Samvit: It’s quite simple, actually. First, you deploy the Exadata Database Service that is plugged into the Azure VNET. Next, you provision the required number of databases, which might be migrated as is or as part of a consolidation exercise. You can use any of the Oracle database tools or utilities to do the migration, or even use Oracle Zero Downtime Migration to automate the entire database migration. Finally, migrate your enterprise application into the Azure environment. Establish the required network configuration to allow communication between the migrated applications and the Oracle databases. And then you are all set to publish your application that is running entirely in Azure. You can leverage other Azure services, like monitoring, log analytics, Power BI, or DevOps tools, to enhance existing applications or even build and deploy new enterprise applications that are powered by the OCI Oracle Database Service in the back end.
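Because the Exadata infrastructure, VM clusters, and databases provisioned for the service are OCI resources, one way to check what has been provisioned is the OCI Python SDK. The following is a minimal sketch under a few assumptions not stated in the episode: the `oci` package is installed, a profile exists in ~/.oci/config, and the compartment OCID is a placeholder.

import oci

# Reads the DEFAULT profile from ~/.oci/config
config = oci.config.from_file()
db_client = oci.database.DatabaseClient(config)

compartment_id = "ocid1.compartment.oc1..exampleuniqueid"  # placeholder

# Exadata infrastructure backing the service
for infra in db_client.list_cloud_exadata_infrastructures(compartment_id=compartment_id).data:
    print("Exadata infrastructure:", infra.display_name, infra.lifecycle_state)

# VM clusters where the databases run
for cluster in db_client.list_cloud_vm_clusters(compartment_id=compartment_id).data:
    print("VM cluster:", cluster.display_name, cluster.lifecycle_state)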
08:25 Lois: What about multi-cloud deployment scenarios where applications reside in Azure, but the Oracle databases are deployed on third-party cloud providers, either as a native solution or on compute instances? Samvit: These Oracle databases can be migrated to Exadata Database Service in the Oracle Database@Azure service. There is no need for a complex cross-cloud connectivity setup between the vendors, and at the same time, you experience the lowest latency between the application and the database deployment. 09:05 Want to learn how to design stunning, responsive enterprise applications directly from your browser with minimal coding? The new Oracle APEX Developer Professional learning path and certification enables you to leverage AI-assisted development, including generative AI and Database 23ai, to build secure, scalable web and mobile applications with advanced AI-powered features. From now through May 15, 2025, we’re waiving the certification exam fee (valued at $245). So, what are you waiting for? Visit mylearn.oracle.com to get started today. 09:45 Nikita: Welcome back! Samvit, what’s the onboarding process like? Samvit: You have to complete the onboarding process to use the service in Microsoft Azure. But before you do that, you first have to complete the subscription process. You must have an active Microsoft Azure account subscription that will be used for subscribing to and onboarding the Oracle Database@Azure service. To subscribe to Oracle Database@Azure, you need to purchase an Oracle Database@Azure private offer from Azure Marketplace. As a customer, you will first reach out to Oracle Sales and negotiate a price for the service. Oracle will provide you with the billing account ID and contact details of the person within the organization who will be handling the service. After this, Oracle will create a private offer in Azure Marketplace. 10:40 Lois: Sorry to interrupt you, but what’s a private offer? Samvit: That’s alright, Lois. Private offers are basically solutions or services created for customers by a Microsoft partner, which, in this case, is Oracle. Purchase of those private offers happens from the private offer management page of Azure Marketplace. But there is a prerequisite: the Azure account must be enabled to make private offer purchases on the subscription from Azure Marketplace. You can refer to the Azure documentation to enable the account if it is not already enabled. You review the offer terms and accept the purchase offer, which takes you to the Create Oracle Subscription page. You validate the subscription and other particulars and proceed with the process. After the service is deployed, the purchase status of the private offer changes to subscribed. There are a few points to note here. Billing and payment are done via Azure, and you can use your Microsoft Azure Consumption Commitment. You can also use your on-premises licenses with the Bring Your Own License option and Unlimited License Agreements to pay towards your service consumption, and you also receive Oracle Support Rewards for every dollar spent on the service. 12:02 Nikita: OK, now that I’m subscribed, what’s next? Samvit: After you complete the subscription step, Oracle Database@Azure will appear as an Azure resource, just like any other Azure service, and you can move on to onboarding. Onboarding begins with the linking of your OCI account, which will be used for provisioning and managing database resources. The account is also used for provisioning infrastructure and software maintenance updates for the database service. You can either provide an existing OCI account or create a new one. Then you set up identity federation between the Azure account and the OCI tenancy. This lets you authenticate to the OCI console using Azure credentials, which you need while performing certain operations in OCI, for example, provisioning databases or receiving infrastructure and software maintenance updates. This is an optional step, but it is recommended that you complete the federation. The last step is to authorize users by assigning groups and roles so that they have the privileges needed to perform different operations. For example, some groups of users can manage Exadata Database Service resources in Azure, while others can manage the databases in OCI. You can refer to the OCI documentation for detailed descriptions of the roles and group names. 13:31 Lois: Right. That will ensure you assign the correct permissions to the appropriate users. Samvit: Exactly. Assigning the correct roles and permissions to individuals inside the organization is a necessary step for transacting in the marketplace and guaranteeing a smooth purchasing experience. Azure Marketplace uses Azure role-based access control to let you acquire solutions certified to run on Azure, and those assignments determine the purchasing privileges within the organization.
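As a small, hedged companion to that role-assignment step, here is a minimal Python sketch that lists the Azure role assignments visible at a subscription scope using the azure-identity and azure-mgmt-authorization packages, which is one way to verify who has been granted which roles. The subscription ID is a placeholder, and the scope and packages shown are assumptions rather than anything prescribed in the episode.

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

subscription_id = "<azure-subscription-id>"  # placeholder
credential = DefaultAzureCredential()
auth_client = AuthorizationManagementClient(credential, subscription_id)

scope = f"/subscriptions/{subscription_id}"

# Print the principal and role definition for every assignment visible at this scope
# (attribute names follow the current azure-mgmt-authorization models).
for assignment in auth_client.role_assignments.list_for_scope(scope):
    print(assignment.principal_id, assignment.role_definition_id)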
14:03 Nikita: There’s so much more we can discuss about Oracle Database@Azure, but we have to stop somewhere! Thank you so much, Samvit, for joining us over these last three episodes. Lois: Yeah, it’s been great to have you, Samvit. Samvit: Thank you for having me. Nikita: Remember, we also have the Oracle Database@Google Cloud service. So, if you want to learn about that, or even if you want to dive deeper into the topics we covered today, go to mylearn.oracle.com and search for the Oracle Cloud Infrastructure Multicloud Architect Professional course. Lois: There are a lot of demonstrations that you’ll surely find useful. Well, that’s all we have for today. Until next time, this is Lois Houston… Nikita: And Nikita Abraham, signing off! 14:43 That’s all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We’d also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.

25 Mar 15min
