The Competition for Agentic Platforms

Imagine a world where routine tasks are handled immediately, complex processes are optimized in real time, and personalized experiences are delivered at scale by employees that work 24/7. This is the promise of "AI agents": intelligent software programs that can act autonomously or assist humans in achieving specific goals. They are one step up from LLMs that simply answer questions: agents take actions, send emails, write proposals, and update spreadsheets. Others have referred to these agents as "Digital Employees". This report delves into the competitive landscape of the agentic AI race and the players wanting to win your business in 2025.

Proprietary Platforms? Or Interconnected?

Salesforce CEO Marc Benioff boldly stated that today's CEOs will be the last to lead all-human workforces, proclaiming that "AI agents are here and they're taking over more work at the office". While Salesforce is pushing its own vision of an AI future, one where its platform becomes the central hub for all your business processes and data and would require all your data to be in Salesforce, Google is taking a different approach, letting you integrate its agents with third-party tools while also integrating seamlessly with Google services. That sounds more attractive to us than dumping all your data into Salesforce.

Tech Giants Building AI Agent Ecosystems

Several tech giants are actively developing platforms and ecosystems to manage and deploy AI agents. These companies are leveraging their existing infrastructure, expertise, and resources to establish a dominant position in this burgeoning market.

Google: Google is positioning itself as a leader in the AI agent space with its Google Agentspace platform. Agentspace offers a workspace where employees can access information, perform complex tasks, and receive proactive suggestions from AI agents tailored to their specific needs. It combines Gemini AI, enterprise search, and custom AI agents to offer businesses a seamless and intelligent way to manage tasks, automate processes, and drive efficiency, with an emphasis on customization.

Oracle: Larry Ellison wants to put all of the US government's data into Oracle Cloud to get insights from it. The strategy here sounds similar to Salesforce's, centred (not surprisingly for Oracle) on a fully proprietary and closed ecosystem.

Microsoft: Microsoft's strategy appears to be centred on deeply integrating AI agents within its existing product ecosystem, making them readily accessible to users of its popular productivity and business applications.

Amazon: Amazon's approach combines a focus on practical, ready-to-use AI agent solutions with investments in long-term research, indicating they are a bit behind the other two hyperscalers.

SAP: SAP is focused on leveraging AI agents to enhance its core ERP offerings, automating complex business processes and providing intelligent assistance to users. Like Salesforce, this requires you to use SAP for everything.

Beyond the Titans

While the tech giants command significant attention, startups and specialized companies are also emerging in the AI agent space seemingly every week. These companies are developing innovative solutions that cater to specific needs and industries, often with a focus on agility and customization. Investors agree this is a gold mine and are making some big investments. Consulting is getting in on this as well: at Aviato we have developed an AI agent that can refactor Java code to JavaScript, completing 80% of the work in mere hours. While developers are still needed for the last 20%, this kind of agent will reduce two-year projects to months.

Conclusion

The race to become the central hub for AI agents is just getting started, and we are excited. As a Google partner, we believe Google Agentspace, which integrates with both Google first-party and multiple third-party tools, is going to win out over the proprietary solutions from Microsoft, Oracle, Salesforce, and AWS: the interconnected nature of data demands a way to connect all data sources rather than mandating that all data be migrated. In addition, we think there is a great opportunity for our AI agents to augment our consultants and provide more value to our clients in less time. 2025 is going to be an exciting year.

Service Extensions for Google Cloud App LBs

If you run a website or app on Google Cloud and you're using their Application Load Balancer to distribute traffic, you may wish you had a way to:

- Add or modify HTTP headers: insert new headers for specific customers, or rewrite client headers on the way to the backend.
- Implement custom security rules: add your own logic to block malicious requests or filter sensitive data.
- Perform custom logging: log user-defined headers or other custom data into Cloud Logging or other tools.
- Rewrite HTML on the fly: dynamically adjust your website content based on user location, device, etc.
- Inject scripts: rewrite HTML for integration with Analytics or reCAPTCHA.

Traditionally, you'd have to achieve this by setting up separate proxy servers or modifying your app. With Service Extensions, you can do all this directly within your Application Load Balancer.

How Service Extensions work:

- They are mini apps: Service Extensions are written in WebAssembly (Wasm), which is fast and secure.
- They run at the edge: they run on the load balancer itself, reducing any potential impact on latency.
- They're fully managed: Google Cloud takes care of all the hard parts.

Why would anyone use Service Extensions?

- Flexibility: tailor your load balancer to your specific needs without complex workarounds.
- Performance: improve response times by processing traffic at the edge.
- Security: enhance your security posture with custom rules and logic.
- Efficiency: reduce operational overhead by offloading tasks to the load balancer.

How to get started: check the docs from Google. Start with the Service Extensions Overview, then the Plugins Overview and How to create a plugin, and finally some Code Samples. Also definitely worth checking out Wasm at https://webassembly.org/ if you have not already. Service Extensions sit in the Cloud Load Balancing processing path.

Video Post: AI is not new

Transcript: So here's the thing about AI: it's not in any way new. Most businesses have been using it behind the scenes for years to solve real-world business problems, things like fraud detection, recommendation engines (when you're shopping and it's suggesting similar products you might be interested in), and optimizing logistics. All different kinds of businesses have been optimized with AI; it used to be called ML. When I joined Google in 2018 there was a case study from 2016 of a Japanese cucumber farmer who built an AI model on TensorFlow and ran it on Raspberry Pis to optimize his cucumber farm. His mom used to spend 8 hours a day sorting cucumbers, and this guy, who did not have a tech background, developed a solution with a motor that would sort the cucumbers automatically, saving his mom 8 hours every time she sorted cucumbers. I think when the LLMs came out there was a lot of realization in the consumer space that AI can achieve business goals, but businesses aren't really going to get any money from having a photo of a cat on the moon, or something else like Chuck Norris fighting dolphins, which is funny but not very profitable. I think the good thing about the LLM craze is that it has brought attention from businesses to other ways they can optimize and save money with AI. And if in 2016 a Japanese cucumber farmer could save 8 hours per day, an entire day of labor for one person, with some AI, there are definitely ways that most businesses can benefit from this. It's just a matter of uncovering them and then applying these AI solutions to them. We've been working with a number of customers on things that will have incremental, maybe double-digit improvements, but lots of these small projects will have an outsized impact across the organization and across the industry. I'd love to chat to people who have an issue they think would be solvable and see if we can help solve it. Thanks.

Vendor Lock-in: We think it's a myth.

The Myth of Vendor Lock-in

The cloud has revolutionized how businesses operate, but we often get stuck in weeks-long project delays trying to avoid vendor lock-in. This article looks at whether this is something you should be concerned about, or whether your efforts are best focused elsewhere. It is best to start with what vendor lock-in actually is.

Understanding Vendor Lock-in

Vendor lock-in occurs when a customer becomes reliant on a specific vendor's products or services, making it difficult or expensive to switch vendors. The business risk here is usually one of the following:

- One vendor could raise prices, and you would be stuck paying the higher price (VMware/Broadcom comes to mind).
- The vendor has multiple outages, or poor support (VMware/Broadcom comes to mind).
- The vendor goes bankrupt, or is acquired by a competitor, taking your business down with it.

The Cloud Hyperscaler Landscape

Cloud hyperscalers like AWS, Azure, and Google Cloud have significantly mitigated the risks of vendor lock-in. Here's why:

- Open standards, open source, and interoperability: Hyperscalers increasingly embrace open standards and APIs. Containers and Kubernetes are one example: every cloud has multiple ways to run standard Docker containers, and these can be moved between clouds with no changes. Each cloud does have proprietary services, especially when we look at databases, but the effort to migrate off and modify these is typically far lower than it has been in the past. Avoiding those databases to escape lock-in with AWS/GCP/Azure can also just mean you are locked into MongoDB, or into an open-source database that is hard to move away from.
- Bankruptcy: If any of these vendors does go bankrupt, it will be a slow process. Google, Microsoft, and Amazon are some of the wealthiest companies in the world, so I think we can discount this.
- Data portability: Hyperscalers offer tools and services to simplify data migration and portability. While moving large datasets can still be complex, the process is becoming more manageable, and hyperscalers will often fully or partially fund a migration away from a competitor. In addition, highly performant network connections between clouds are available, as are physical devices to move the largest of datasets quickly.
- Market competition: The intense competition among cloud hyperscalers drives down prices; there have only been a few occasions where services increased in cost. This competition is not likely to ease in the near term.

Mitigating Vendor Lock-in Concerns

While the risks of vendor lock-in are lower with cloud hyperscalers, if this is a concern there are a few steps you can take to reduce the effort if you ever do need to migrate:

- Design for portability: Architect applications and data structures with portability in mind from the outset.
- Avoid proprietary services: Minimize reliance on vendor-specific databases that lack equivalents on other platforms.

Conclusion

The cloud hyperscaler era has produced strong competition, which has significantly diminished the concerns around vendor lock-in. Open standards, data portability, and market competition have allowed businesses to focus less on lock-in and more on transforming their business. Some level of lock-in will always exist, so it is about choosing where you are locked in: if you go all open source and build your own servers, you are locked into that stack. We believe the focus should shift from fearing vendor lock-in to strategically leveraging the cloud's capabilities to drive innovation and business growth.
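To make the "design for portability" advice concrete, here is a minimal sketch of hiding object storage behind a small interface so the backend can be swapped between Google Cloud Storage and S3. The BlobStore interface and bucket name are our own hypothetical example, not a library API.

```python
# Sketch: a vendor-neutral storage interface with swappable GCS and S3 backends.
# The BlobStore class and bucket names are hypothetical examples.
from abc import ABC, abstractmethod


class BlobStore(ABC):
    """Interface the application codes against, independent of the cloud vendor."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class GcsBlobStore(BlobStore):
    def __init__(self, bucket_name: str):
        from google.cloud import storage  # pip install google-cloud-storage
        self._bucket = storage.Client().bucket(bucket_name)

    def put(self, key: str, data: bytes) -> None:
        self._bucket.blob(key).upload_from_string(data)

    def get(self, key: str) -> bytes:
        return self._bucket.blob(key).download_as_bytes()


class S3BlobStore(BlobStore):
    def __init__(self, bucket_name: str):
        import boto3  # pip install boto3
        self._s3 = boto3.client("s3")
        self._bucket = bucket_name

    def put(self, key: str, data: bytes) -> None:
        self._s3.put_object(Bucket=self._bucket, Key=key, Body=data)

    def get(self, key: str) -> bytes:
        return self._s3.get_object(Bucket=self._bucket, Key=key)["Body"].read()


# Application code depends only on BlobStore; swapping clouds is a one-line change.
store: BlobStore = GcsBlobStore("my-app-bucket")  # or S3BlobStore("my-app-bucket")
store.put("reports/2025-01.txt", b"hello")
print(store.get("reports/2025-01.txt"))
```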

Getting Started with GCP is easy… but not so fast.

Transcript: Google makes it easy to get started with Google Cloud, but at the expense of some of the controls that large enterprises need to have when they're running workloads on any public cloud. Google do this so that developers can very easily get started; if they made it really hard to start using Google Cloud, people would use one of the other clouds that was a little bit easier to use. However, when you start putting production workloads on there that might have customers' information in them, you need to revisit that security and put some controls around it. Setting this up the right way is not hard. Google even released the code to build all the infrastructure and put it on GitHub. You can easily find it if you Google "Fabric FAST": the first result will be the GitHub repo from Google Cloud's Professional Services team, where they've put the code that you can run to enforce all of their best practices for you. Now, if you need help running this (it can be a little bit complex), or if you want any advice on how to get started with it, hit me up. I'm always happy to talk about this kind of stuff. Thanks.

AI Just Got a HUGE Upgrade (And You Need to Know Why)

Transcript: For all those AI nerds, there have been some pretty interesting announcements from Google. Number one: Anthropic's Claude 3 is now generally available on Vertex AI. Gemini 1.5 Pro and Gemini 1.5 Flash are also generally available. There are over 700,000 models on Hugging Face, and you can use any of the models on Hugging Face with Vertex AI. For those not familiar, Hugging Face is kind of like a repository, like GitLab but for AI models, so people take off-the-shelf models or create their own, modify them, and then upload them to Hugging Face. The next thing that's super interesting is context caching. You can use context caching with the Gemini 1.5 Pro and Gemini 1.5 Flash models, and this lets you cache some of the tokens that you have uploaded. So if you have uploaded a video and you want to ask multiple questions about it, you don't need to upload that video each time (which is obviously going to be charged); you can upload it once and ask multiple questions. Same thing if you have chatbots with very long instructions, or if you've got a large amount of documents and you're asking different queries about those documents. The final use case I think is interesting is if you have a code repository and you're looking to fix a lot of bugs: upload it once, cache that context, and then you can run a lot of queries against it, reducing both the cost and the latency to get those insights. If you need help with any of this, feel free to reach out; always happy to have a chat. Thank you.
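For reference, here is a minimal sketch of what context caching looks like with the Vertex AI Python SDK's preview caching module at the time of writing. The project, region, model version, and Cloud Storage URI are placeholder assumptions, so check the current SDK documentation before relying on the exact calls.

```python
# Sketch of Vertex AI context caching (preview API); project, region, model
# version and the GCS URI below are placeholders.
# Requires: pip install google-cloud-aiplatform
import datetime

import vertexai
from vertexai.preview import caching
from vertexai.preview.generative_models import GenerativeModel, Part

vertexai.init(project="my-project", location="us-central1")

# Upload the large content (e.g. a long video) once and cache it for an hour.
cached = caching.CachedContent.create(
    model_name="gemini-1.5-pro-001",
    contents=[Part.from_uri("gs://my-bucket/long-video.mp4", mime_type="video/mp4")],
    ttl=datetime.timedelta(hours=1),
)

# Ask multiple questions against the same cached tokens without re-uploading.
model = GenerativeModel.from_cached_content(cached_content=cached)
print(model.generate_content("Summarise the key events in the video.").text)
print(model.generate_content("List every speaker that appears.").text)
```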

Is Your Google Cloud Bill Out of Control?

Transcript: So you started using cloud and your costs keep growing and growing; every month it seems to be more and more money that you're spending on cloud, and you've realised it's time to take a look and cut those costs down to something more sensible. If you're using Google Cloud, they've got the FinOps hub and billing reports where you can see how these costs break down. They're often broken down by project, so you can see where some of the hot spots are. To reduce them, the next thing you should look at is dev and test workloads. People pay me to come in and consult and say: hey, do you really need your development workloads running 24/7 when your developers are only working 9 to 5? It's pretty logical: get them turned off when they're not in use. Even better, get those running on Spot instances. These are substantially cheaper, but when Google has low capacity they will take them away from you. That will kill the developer workflow, but it will ensure that your developers are writing code that can tolerate failures, which is key to running anything on cloud. The next thing you want to do is enhance the visibility you're getting into where you're spending money. This is done with labels: every project or every resource that you have running should have a label on it with an owner, and that owner should get, not an invoice, but a report at the end of each month showing how much money they've spent. That will empower your team to realise they might be spending money they don't know about, and to have a look and see if they can reduce it themselves. This is really simple with Google: create labels, put them on everything, and then export all the billing data into BigQuery so you can slice it, dice it, run reports, and figure out where you need to focus your cost savings. Another thing that often gets missed is right-sizing compute machines: within Compute Engine you can individually change your memory and CPU to right-size a machine to your workload. A lot of people do this as a one-time exercise, kind of guess it, and never come back to revisit it. There are tons of reports in Google Cloud where you can go through, have a look at these things, and then save yourself considerable money just by getting rid of unnecessary resources that your machines aren't using. If you do need anyone to have a look at this, feel free to reach out. Thank you.
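One way to produce the per-owner monthly report mentioned above is to query the standard Cloud Billing export to BigQuery. This is a rough sketch: the project, dataset, and export table names are placeholders, and it assumes resources carry an "owner" label as described in the video.

```python
# Sketch: summarising last month's spend by the "owner" label from the standard
# Cloud Billing export to BigQuery. Replace the table name with your own export.
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()

query = """
SELECT
  (SELECT l.value FROM UNNEST(labels) AS l WHERE l.key = 'owner') AS owner,
  ROUND(SUM(cost), 2) AS total_cost
FROM `my-project.billing_export.gcp_billing_export_v1_XXXXXX`
WHERE usage_start_time >= TIMESTAMP(DATE_TRUNC(DATE_SUB(CURRENT_DATE(), INTERVAL 1 MONTH), MONTH))
  AND usage_start_time <  TIMESTAMP(DATE_TRUNC(CURRENT_DATE(), MONTH))
GROUP BY owner
ORDER BY total_cost DESC
"""

# Print a simple per-owner cost report that could be emailed each month.
for row in client.query(query).result():
    print(f"{row.owner or 'unlabelled'}: ${row.total_cost}")
```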

How do you know if AI is actually answering your question, and not just spitting out nonsense?

Transcript: So I'm going to break down a few concepts you might have been hearing when people are talking about AI. The first one is retrieval-augmented generation, or RAG. It's a bit of a mouthful, but it's really simple. If you ask an LLM a question, so ChatGPT or Google's Gemini, it's going to respond based on what it has been trained on, which is the context of the entire internet, but nothing specific to your business. RAG solves this problem by taking your business data and uploading it into a database, so that when you ask a question, it can retrieve data from the database based on your business and then formulate a response that's grounded in that. This reduces hallucinations (LLMs making up nonsense) and makes sure it's using data that is real, from your business. Now, the other concept we have is chunking. If we're taking documents and uploading them into the database, we don't want to upload entire documents, because we're not going to send entire documents to the LLM (very expensive). So we chunk them. You could chunk by paragraph, but sometimes you need the paragraphs surrounding that paragraph to get the full context, or you can chunk by headings. Different things are going to work for different businesses depending on how your data is structured. By refining the way we chunk and store data in the database so that the LLM can retrieve it, and by swapping out the LLM model, we can optimize for your business, making sure that you get the best results possible for the best price possible. Whenever a new LLM is released we can also test it very rapidly and see if it's going to give you better results or a better price. If you're interested in learning more about this, feel free to reach out; happy to have a chat with anyone on these subjects. Thanks.
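To illustrate the chunking idea, here is a small sketch in plain Python of paragraph-based chunking with a one-paragraph overlap so surrounding context is preserved. The size limit and file name are arbitrary examples, not a library API.

```python
# Sketch: paragraph-based chunking with neighbouring-paragraph overlap, as
# described above. Plain Python; the chunk size and file name are placeholders.
def chunk_by_paragraph(text: str, max_chars: int = 1200, overlap_paragraphs: int = 1) -> list[str]:
    """Split text into chunks of whole paragraphs, carrying the last
    `overlap_paragraphs` paragraphs into the next chunk for context."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    chunks, current = [], []

    for para in paragraphs:
        if current and len("\n\n".join(current + [para])) > max_chars:
            chunks.append("\n\n".join(current))
            # Carry the trailing paragraph(s) over so context is not lost.
            current = current[-overlap_paragraphs:] if overlap_paragraphs else []
        current.append(para)

    if current:
        chunks.append("\n\n".join(current))
    return chunks


if __name__ == "__main__":
    document = open("policy.txt").read()  # placeholder file name
    for i, chunk in enumerate(chunk_by_paragraph(document)):
        print(f"chunk {i}: {len(chunk)} chars")
```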

Video Post: Stop Wasting Money! Easy Cost Cutting for Your Business!

Getting Started With Google Cloud

Transcript: Getting started with Google Cloud can seem overwhelming at first. As with any cloud, there are a lot of services that you can use, and each has configuration options that can get you into trouble. When I worked at Google Cloud I helped some of the biggest brands in Australia set up their cloud environments, and I'll give you a few tips that I learnt from doing that. The first thing you want to do is put some structure in place: creating an organisation, and then creating folders in the organisation to keep projects organised and to let you give groups of users permissions on those projects. For example, putting all the development projects in a folder called development and giving developers access to those, and then having all of the production projects in a production folder, maybe without access for developers or for other groups of people. Once you have the folders set up, you need to set up identity and access management. As I touched on, that's creating a group, putting developers in the group, and then giving that group access to the folder that contains the projects those developers need to work on to do their jobs. We may not want to give them access to the production folder at all, or maybe we only give them read-only access. This is a super simple example; we can nest folders and get much more complex with it, and any real environment is going to have more complexity than that, but this is a simple explanation. Now we want to start talking about organisational policies. We've got a group of developers with access to their projects in the development folder, but we still don't want them to do anything silly like putting a Cloud Storage bucket on the internet so that anyone can see what our files are, even if that is mocked development data; having a data breach is not going to look good in the headlines. There are a ton of org policies, and each one of them needs to be configured. For this one example with the Cloud Storage bucket, we may need an exception for a genuinely public-facing bucket to be on the internet. Once we have all this set up, we want to make sure that we're managing it with code. If a developer does request that a Cloud Storage bucket be put on the internet, we want to see who requested that and why, and track those changes. The logical step here is using infrastructure as code. We use Terraform, the same best practice that Google Cloud's Professional Services team used when I was there, and we can do the same for your business in five days, excluding any complex networking. Some people spend much longer on this, and it's really not that complex. If this sounds too complex, do reach out; we've done this while working at Google, so we know the best practices and we know how to set you up securely so that your business can scale on Google Cloud. Thank you.
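As a rough illustration of the folder and IAM structure described above, here is a sketch using the Resource Manager Python client. The organisation ID, group, and role are placeholder examples; in practice, as mentioned in the video, this would be managed with Terraform rather than ad hoc scripts.

```python
# Sketch: create development and production folders under an organisation and
# grant a developer group access to the development folder only.
# Org ID, group, and role are placeholders.
from google.cloud import resourcemanager_v3  # pip install google-cloud-resource-manager

folders = resourcemanager_v3.FoldersClient()
org = "organizations/123456789012"  # placeholder organisation ID

dev_op = folders.create_folder(
    request=resourcemanager_v3.CreateFolderRequest(
        folder=resourcemanager_v3.Folder(parent=org, display_name="development")
    )
)
dev_folder = dev_op.result().name  # e.g. "folders/111111111111"

prod_op = folders.create_folder(
    request=resourcemanager_v3.CreateFolderRequest(
        folder=resourcemanager_v3.Folder(parent=org, display_name="production")
    )
)
prod_op.result()

# Grant the developer group a role on the development folder only.
policy = folders.get_iam_policy(request={"resource": dev_folder})
policy.bindings.add(role="roles/editor", members=["group:developers@example.com"])
folders.set_iam_policy(request={"resource": dev_folder, "policy": policy})
```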

Using AI for Document Processing

Artificial intelligence (AI) has captured the world's imagination with its impressive ability to generate human-like text and engage in conversations, often blurring the lines between human and machine. While these "cool" applications have gained widespread attention, their practical value beyond chatbots has remained somewhat elusive. However, one area where AI is quietly making waves is document processing. AI agents, equipped with advanced natural language processing (NLP) capabilities, can read and understand thousands of words in mere seconds. This opens up a world of possibilities for streamlining and automating tasks that previously consumed countless hours of human labor. The potential to reduce the time spent on document processing is enormous. Consider the following fields:

- Legal: Summarise lengthy contracts, find legal precedents, and summarise arguments.
- Healthcare: Analyse records, review literature and research to support diagnosis, or simplify text for patient understanding.
- Finance: Analyse financial statements, reports, and filings to identify risks and inform investment decisions.

Beyond these obvious industry-specific use cases, any organisation that deals with documents can benefit from some AI help to improve efficiency and reduce costs.

Enter Google Cloud

Google Cloud Document AI uses advanced character recognition to extract data from your documents, with highly accurate document processors that extract, classify, and split documents. Google's highly scalable infrastructure can ingest your company's documents and analyse them instantly. This can be used for:

- Better understanding your customers: Information from client SMS, emails, and documents is often siloed; bringing all this data together helps you gain a better understanding of your customers and their behaviors.
- Reducing fraud: Most fraudulent documents contain subtle issues that are not noticeable to the human eye, but AI can detect them (much like we can easily spot issues in AI-generated images), reducing revenue lost to fraudulent documents.
- Report writing: Writing documents by hand has always been time-consuming, and we typically rely on templates. AI takes this to the next level, writing entire reports in seconds based on the data you have on your clients.

While the "cool" factor of AI chatbots may have captured our initial attention, the true value of AI lies in its ability to transform industries and improve our lives. As AI agents continue to evolve and mature, their impact on document processing and other fields will only grow, ushering in a new era of efficiency and productivity.
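As a concrete example, here is a minimal sketch of sending a single PDF to a Document AI processor with the Python client library and reading back the extracted text and entities. The project, location, processor ID, and file name are placeholders; you would create the processor (for example, an invoice or form parser) in the console first.

```python
# Sketch: process one PDF with a Document AI processor and print the extracted
# text and entities. Project, location, processor ID and file name are placeholders.
from google.cloud import documentai  # pip install google-cloud-documentai

client = documentai.DocumentProcessorServiceClient()
name = client.processor_path("my-project", "us", "my-processor-id")

with open("invoice.pdf", "rb") as f:
    raw_document = documentai.RawDocument(content=f.read(), mime_type="application/pdf")

result = client.process_document(
    request=documentai.ProcessRequest(name=name, raw_document=raw_document)
)
document = result.document

print(document.text[:500])  # the recognised text
for entity in document.entities:  # structured fields the processor extracted
    print(f"{entity.type_}: {entity.mention_text} (confidence {entity.confidence:.2f})")
```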

© 2025 Aviato Consulting. All rights reserved.