Generative AI in the Software Buying Process: Overhyped or Helpful?
Unless you’ve lived under a rock for the past few years, you’ve heard of generative AI tools like ChatGPT. The AI hype is everywhere, both within and outside the tech industry. AI tools are likely to change how workers approach many day-to-day tasks, including buying new software.
Let’s expand on our 2024 B2B Buying Disconnect Report’s analysis of software buyer attitudes toward generative AI and how it’s changing the software buying process.
We’ll dive into:
- What are generative AI tools?
- How are generative AI tools being used in the workplace?
- How much trust do people have in generative AI tools?
- The Trust Factor: Implications for generative AI tools in the software buying process
What are generative AI tools?
If you ask ChatGPT, “What are generative AI tools?,” you’ll likely get an output similar to the following:
“Generative AI tools are artificial intelligence systems designed to create new content, such as text, images, music, code, and more, based on input data. These tools leverage deep learning techniques, particularly generative models, to produce novel outputs that are often indistinguishable from human-created content. Here are some of the main types of generative AI tools:
- Text generation:
  - GPT (generative pre-trained transformer): Models like OpenAI’s GPT-3 and GPT-4 generate human-like text for various applications, including chatbots, content creation, and language translation.
  - BERT (Bidirectional Encoder Representations from Transformers): Though primarily used for understanding and processing language, it can also generate text.
- Image generation:
  - GANs (generative adversarial networks): Models like StyleGAN and BigGAN create realistic images from random noise or specific inputs.
  - VAEs (variational autoencoders): These models generate new images by learning latent representations of input data.
  - DALL-E: A model by OpenAI that generates images from textual descriptions.
- Music and audio generation:
  - Jukedeck: Generates music tracks based on user preferences.
  - OpenAI’s MuseNet: Composes music with various instruments and styles.
- Video generation:
  - Deep Video Portraits: Creates realistic video renditions of human faces.
  - Synthesia: Generates synthetic videos of people speaking in different languages.
- Code generation:
  - GitHub Copilot: An AI pair programmer powered by OpenAI’s Codex, which suggests code snippets and entire functions based on the code context.
- Design and art:
  - DeepArt: Applies artistic styles to photos to create artworks.
  - RunwayML: A platform offering various generative tools for creators and artists.
- 3D model generation:
  - Nvidia’s GauGAN: Generates photorealistic images from simple sketches, useful for creating 3D models.
These tools are widely used in various industries, including entertainment, marketing, design, and software development, to enhance creativity, automate tasks, and produce high-quality content efficiently.”
As you can see, ChatGPT got a lot right. While it’s debatable whether outputs from generative AI tools are truly indistinguishable from human-created content, the factual information listed, such as use cases and examples of generative AI tools, is correct, though incomplete. Notably, this output focuses on GenAI tools created by OpenAI (the maker of ChatGPT) without diving into the capabilities of its direct competitors.
In the author’s experience, ChatGPT is generally good at summarizing textual input. It’s also helpful for outlining content pieces and for editing. That said, these models are imperfect, and outputs generally need to be checked for accuracy and quality.
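For readers curious what that looks like in practice, here is a minimal sketch of the kind of summarization task described above. It assumes the official openai Python client and an API key in the environment; the review snippet and model name are hypothetical placeholders rather than a recommendation.

```python
# A minimal sketch of the summarization workflow described above, assuming the
# official openai Python client (pip install openai) and an OPENAI_API_KEY set
# in the environment. The review snippet and model name are placeholders.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

review_text = (
    "We rolled the tool out to 40 sales reps last quarter. Setup took two weeks, "
    "the reporting dashboards are strong, but the mobile app still crashes on login."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model choice; any chat-capable model works
    messages=[
        {"role": "system", "content": "Summarize the user's text in two short bullet points."},
        {"role": "user", "content": review_text},
    ],
)

print(response.choices[0].message.content)
```

As noted above, the output still needs a human pass for accuracy and quality before it goes anywhere important.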
How are generative AI tools being used in the workplace?
We asked a sample of over 2,000 technology buyers whether they use generative AI tools at work. 66% of those surveyed indicated that they use generative AI tools at least sometimes, and 48% said they use generative AI at least monthly.
It’s clear from this data that most people are at least open to the idea of incorporating generative AI into their workflows. The question now is, how are they utilizing these tools?
How AI tools are used in the workplace likely varies heavily based on factors such as an individual’s role and industry. Organizational policies around AI tools also vary and may rule out some use cases.
Denyse Drummond-Dunn, President of C3Centricity, was an early adopter of ChatGPT and primarily uses it for research and writing tasks. According to Denyse:
“I beta tested ChatGPT so have been using it for a few years now. I love it, especially now that it has fewer hallucinations and provides the sources of the information it uses in replies to prompts.
“It saves me time and I also use it as a source of inspiration to find other angles for my blog posts on customer centricity. I now write in a couple of hours what used to take me a couple of days to research and complete.”
Meanwhile, Rob Diamant, Senior Developer at Momentus Technologies, doesn’t use generative AI tools, saying that they’re inaccurate because “[I] can get different responses to the same question.”
Many of the professionals we surveyed indicated that generative AI (GenAI) is a good starting point for creative projects, but its outputs cannot be trusted implicitly. Nir Levy, Vice President, R&D, DevOps, and Product at Zoom Technologies, says, “[Gen AI is] helpful in getting things started but not very accurate. You really can’t trust anything that comes out of Gen AI, but it’s a good start [in] many cases.”
Beyond research tasks, many generative AI users utilize these tools to automate everyday administrative tasks, likening GenAI to a personal assistant. That doesn’t mean that GenAI has no place in more technical tasks. Along with several members of our buyer community, TrustRadius leverages generative AI tools as a part of our product engineering process.
“We are witnessing a paradigm shift in how software is created, engineered, and maintained. As a result, engineers at TrustRadius are leveraging AGI co-pilots to expedite the learning of new concepts, rapidly create product prototypes, generate quality checks, and provide options for large-scale system migrations. This has significantly decreased our overall time to market while also allowing engineers to focus on areas AGI is not well-suited for yet. Although there is an immense amount of leverage we get from these AGI capabilities, we anticipate greater value in the near future. Our teams are currently experimenting with more advanced agentic frameworks to rapidly scale our product engineering processes to deliver more value for our customers.”
Businesses across several industries are experimenting with and leveraging generative AI as a part of their workflows. However, using these tools doesn’t mean they’re perfect, and most folks aren’t ready to trust AI 100% just yet.
How much trust do people have in generative AI tools?
In our 2024 B2B Buying Disconnect report, we asked software buyers how much they trusted content created by generative AI tools. Unsurprisingly, most of the individuals surveyed do not implicitly trust content created by generative AI tools.
Buyers reported the following trust levels in content created by generative AI tools:
- Sometimes trust: 51%
- Rarely trust: 28%
- Never trust: 11%
- Mostly trust: 9%
When asked why they do or don’t trust content created by generative AI, there were a few common themes in the responses.
- The technology is too new to be trusted implicitly.
- Generative AI models are trained on information from the internet, and the internet isn’t always accurate.
- These tools don’t have the critical thinking skills of a human.
- These models are good at summarizing data, but those summaries can leave out personalized information.
- These tools are helpful to fill in gaps or to jump-start the research process, but they’re insufficient by themselves.
- Models used by generative AI tools have to be trained, and this takes time, so they can’t provide insights into recent events.
10% of participants in our survey said they felt generative AI made their buying process more difficult. These buyers indicated they don’t fully trust AI-generated content because, as one buyer said, “It’s hard to differentiate what information was provided by a human and what was generated as filler.”
Beyond this, multiple buyers indicated that content created by generative AI lacked the user insights they were seeking to inform their buying process, stating, “Since AI pulls from generalizations and averages, I don’t know if it’s actually giving me the personalized opinion I need,” and “the details provided by AI are after zero use experience.”
The accuracy of the outputs and the lack of real-life experience are big concerns when utilizing content created with generative AI. One buyer even said, “I am worried it might pick up info from fake reviews or biased reviews.” We are too—which is why TrustRadius works to ensure that our reviews are written by real people with real product experience. Read more about how we fight fraud in our 2024 Review Quality Report.
It appears, for now, that buyers are still seeking out the human perspective on their software purchases. While some buyers may leverage generative AI tools as a starting point, 56% sought out a conversation with someone who had used the product they were researching before making a purchase decision.
The Trust Factor: Implications for generative AI tools in the software buying process
Using generative AI in the software buying process in 2024
As outlined above, generative AI tools have a variety of use cases that can help people work more efficiently. That said, these tools have a ways to go before they’re commonplace in the software buying process.
21% of buyers reported using generative AI tools as a part of their software buying process within the past year. A further 9% said they’d tried using generative AI tools, but they weren’t helpful. The rest, roughly 70% of software buyers, did not leverage generative AI at all in their research process.
This makes sense given that many of the buyers who said they sometimes or mostly trust content created by generative AI tools also said they felt it was important to fact-check the outputs. In some cases, that fact-checking might feel like an extra step that takes just as long as researching the products without generative AI would.
Predictions for the future
Currently, most buyers don’t feel that generative AI has any influence on their software buying process. When asked how generative AI impacted their buying process:
- 68% said it had no impact
- 20% said it was helpful
- 10% found it frustrating, saying that the proliferation of AI-based content made it harder for them to find the information they needed
These statistics are particularly interesting given that 78% of the buyers we surveyed said they use Google as a part of their research process. Given Google’s pivot to AI, including its rollout of AI Overviews on search engine results pages (SERPs), AI content is likely to be inescapable in the software buying process moving forward.
Further, generative AI technology is not standing still. A big concern raised by the buyers we surveyed was that these models have to be retrained and may be working with outdated information. That’s likely to become less of an issue over time. By default, ChatGPT uses user inputs to train its models, and it’s entirely possible OpenAI could use this data to personalize outputs and make them more relevant to each user.
There are also paid options that allow users to train large language models (LLMs) on their own datasets, which would address the concern cited above that outputs are based on outdated training material. At this point, this process is probably too technical to be worth it for something like a single software purchase, but it stands to reason that companies such as OpenAI are working to make it easier for customers to leverage their own data within the product.
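For illustration, here is a rough sketch of what one of those paid options can look like, using OpenAI’s fine-tuning API as one example. The file name, training examples, and base model below are hypothetical placeholders; the models that currently support fine-tuning should be checked against OpenAI’s documentation.

```python
# A rough sketch of the "train an LLM on your own data" option mentioned above,
# using OpenAI's fine-tuning API as one example. The file name, training
# examples, and base model are placeholders.
from openai import OpenAI

client = OpenAI()

# Training data is a JSONL file of chat-style examples, one JSON object per line:
# {"messages": [{"role": "user", "content": "Summarize our CRM evaluation notes."},
#               {"role": "assistant", "content": "Shortlisted two vendors; ..."}]}
training_file = client.files.create(
    file=open("internal_software_evals.jsonl", "rb"),
    purpose="fine-tune",
)

job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-mini-2024-07-18",  # placeholder base model
)

print(job.id, job.status)  # the job runs asynchronously on OpenAI's side
```

Even in this simplified form, the workflow assumes someone has curated clean, current internal data, which is exactly the kind of effort a one-off software purchase rarely justifies today.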
As these tools advance, we’ll likely see a wealth of new use cases that we haven’t yet considered. There’s no reason to believe that software buying won’t be one of them if they can solve for the trust factor.