Everyone in the community was surprised by ChatGPT last year: a web service that responded to any and all user questions with surprising fluidity.
ChatGPT is a variant of the powerful GPT-3 large language model created by OpenAI, a company backed by Microsoft. It is still a demo, though it is pretty clear that this type of generative AI will be rapidly commercialized. Indeed, Microsoft is embedding generative AI in its Bing search service, and Google is building a rival offering.
So what are smaller businesses to do to ensure their messages are heard by these machine learning giants?
For this latest podcast from The New Stack, we discussed these issues with Ryan Johnston, chief marketing officer for Writer. Writer has enjoyed early success in generative AI technologies. The company's service is dedicated to a single mission: making sure its customers' content adheres to the guidelines they set in place.
This can include features such as ensuring the language in the copy matches the company's own designated terminology, making sure that a piece of content covers all the required topic points, or even that a press release has quotes that stay within the scope of the project mission itself.
In short, the service promises "consistently on-brand content at scale," Johnston said. "It's not taking away my creativity. But it is doing a great job of figuring out how to create content for me at a faster pace, [content] that actually sounds like what I want it to sound like."
For our conversation, we first delved into how the company was started, its value proposition ("what is it used for?") and the role AI plays in the company's offering. We also delve a bit into the technology stack Writer deploys to offer these services, as well as what material Writer may require from its customers to make the service work.
For the second part of our conversation, we turn our attention to how other companies (that are not search giants) can get their message across in the land of large language models, and maybe even find a few new sources of AI-generated value along the way. And, for those public-facing businesses dealing with Google and Bing, we chat about how they should refine their search engine optimization (SEO) strategies to be best represented in these large models.
One point to consider: While AI can generate a lot of pretty convincing text, you still need a human in the loop to oversee the results, Johnston advised.
"We are augmenting content teams copywriters to do what they do best, just even better. So we're scaling the mundane parts of the process that you may not love. We are helping you get a first draft on paper when you've got writer's block," Johnston said. "But at the end of the day, our belief is there needs to be a great writer in the driver's seat. [You] should never just be fully reliant on AI to produce things that you're going to immediately take to market."
Alex Williams 0:08
You're listening to The New Stack Makers, a podcast made for people who develop, deploy and manage software at scale. For more conversations and articles, go to thenewstack.io. All right, now on with the show.
Joab Jackson 0:32
Hello, welcome to the latest edition of The New Stack podcast. And this week we're going to discuss ChatGPT. Late last year, ChatGPT surprised the world: a web service that responded to any and all user questions with a kind of surprising, and sometimes frightening, fluidity. Now, ChatGPT is still a demo. It's a variant of the powerful GPT-3 large language model, which was created by OpenAI, a company backed by Microsoft. And everybody had fun with ChatGPT, but it's pretty clear that this sort of generative AI will have profound effects across the industry: not just ChatGPT, but a lot of companies that are working with these AI-produced or AI-assisted technologies. For this podcast, we'll be speaking with a company that has already had some success in this space, a company called Writer. We will be joined by Writer's chief marketing officer, Ryan Johnston. And we'll discuss, you know, Writer's success, and what Writer and other companies are thinking that customers should be doing to prepare for this kind of new post-search world that we might be in. All right. Thank you so much for joining us.
Ryan Johnston 1:57
Thanks for having me, Joab. It's obviously an exciting topic and an exciting time right now. So I'm happy to be here and looking forward to this conversation.
Joab Jackson 2:04
Indeed. Excellent, terrific. Now, a bit of a disclosure: Insight Partners, which owns The New Stack, is an investor in Writer, so we should mention that. But let's jump right into it. How was Writer founded? Or how did it come about?
Ryan Johnston 2:19
Yeah, so Writer was founded in about early 2020, and it actually was a pivot. Prior to that, Writer had been another company called Qordoba, which was in the localization space. And so they were deeply interested in machine learning models; they were deeply interested in the transformer technology that was being developed around that time. Machine learning engineers were really looking at how to do machine translation, and that was really what was driving the innovation that was happening with transformers. And so the founding team of Qordoba, now Writer, were looking at all this innovation that was happening, and they said, what if we were able to take transformers and use those to help everybody write better? And so there was this clear moment where they could see into the future, and they could say: transformers can really transform the way that we're going to be able to write, as a business, as an individual. Therefore, let's lean into that and help people create amazing content through great writing, through the power of things like transformers and artificial intelligence. So that's essentially where the founding and the pivot came from.
Joab Jackson 3:20
Nice, nice. And you know, The New Stack is a small business, and we have lots of correspondence, we have lots of content. So it sounds like a pretty great service, where you can kind of standardize and make sure the language that you have is correct. What is the service Writer offers? Explain to us what Writer offers.
Ryan Johnston 3:37
Writer is a generative AI platform built for the needs of teams and businesses. Unlike other AI products, Writer is trained on a company's own data and the style and brand guidelines that they might bring to the table. And so this is all done securely; it's all done on data provided by the company. But what you get is consistently on-brand content as your output, whether that's written by a human or written by an AI initially. And so it's really solving a major problem for companies, where they can now generate content that sounds like them, that's going to be on brand, that's going to be written in the way that they want. And so that output is really important for helping to scale and accelerate your content processes.
Joab Jackson 4:16
That's a great idea. When you say content, what do you mean? What sorts of content?
Ryan Johnston 4:21
There are the very clear use cases, blog posts, press releases, but there's also product descriptions, and there's in-product content. We've got a whole content design customer base that is thinking about the content that's within the product. They're in there using our integration with Figma to get the right content in there for each new feature that they're releasing or developing. So when you think of content, you can think about the broad swath of just text that's being created, writing that's happening wherever it's happening.
Joab Jackson 4:49
So who are your customers? Or what sorts of companies are your customers, I should say?
Ryan Johnston 4:55
We've got an awesome list of customers, everything from UnitedHealthcare to UiPath to Spotify to Uber. And these are all names that you likely recognize for a reason. Companies that are investing in their brand, that care about their brand and care about their content, are also companies that are a great fit for Writer. And so really, we're able to work so closely together with them because their mission, and the way they plan to grow their business, is so aligned with our mission as well. And they care about writing at the end of the day.
Joab Jackson 5:21
Terrific. Who in the company do you usually work with? Do you work with copywriters? Do you work with editors? Do you work with marketing people? Who's your contact?
Ryan Johnston 5:33
Yeah, we've seen a lot of success with content design teams within UX. We've seen a lot of success with content marketers in the marketing organization. But we're having a lot of conversations with CMOs right now; we're having a lot of conversations with people who are thinking about generative AI across their business. And so it's becoming very much a situation where people are realizing there's internal and external content being produced by support teams, HR teams, sales organizations. So it's becoming a very pervasive technology that's impacting entire organizations. We've seen a lot of success within the marketing space, and the CMO has been our champion as of late.
Joab Jackson 6:06
Very nice, very nice. Are you eliminating copywriting jobs, or has the role changed a bit?
Ryan Johnston 6:13
Absolutely not, and I just want to make that clear for everybody. We are augmenting content teams and copywriters, at the end of the day, to help them do what they do best, just even better. So we're scaling the mundane parts of the process that you may not love. We're helping you get a first draft on paper when you've got writer's block. But at the end of the day, our belief is there needs to be a great writer in the driver's seat. You should have somebody with strong writing skills helping you to generate this content, edit this content and get it published. And you should never just be fully reliant on AI to produce things that you're going to immediately take to market.
Joab Jackson 6:46
So there's a button I can push that says, populate The New Stack for the next week? I'm going on vacation.
Ryan Johnston 6:53
You can try it, but we definitely...
Joab Jackson 6:55
So I'm very curious: when a customer engages with you, what data does the customer need to provide? What domain data do they have that they need to share with you?
Ryan Johnston 7:05
Customer data is something we, from day one, have been very, very sensitive to, trying to make sure that we are a secure platform that our businesses and customers can use, and that we limit the amount of data that we need from a customer as much as possible. We're talking basic profile information: name, product preferences, email, potentially billing information, but that's about it. Now, the more context that a customer can give us, the better content that we can create for them. So what we can do is personalize their model and train their model on their own content, such that if you give us your best blog posts, you give us your best content, we can produce a better output for you based on that context. But know that we're not storing that data; we're using that data in what we call a transient way, such that we can reference that data, but it's not actually going into our model. And no other customer has access to that data either.
Joab Jackson 7:54
When you say train the model, you can train it on, say, The New Stack? I'm just using our company as an example. The New Stack has a certain particular style in which we communicate. Beyond that, there's the factual information we need to communicate to our sponsors and our readers. But you can actually fine-tune for a particular voice, if you want it more formal, less formal, that sort of thing?
Ryan Johnston 8:16
Yeah, absolutely. There's two ways we can think about this. One is, we've got a customer, Adore Me. They're an e-commerce company, and they write thousands and thousands of product descriptions, as many as you could count. And they've got a specific brand voice that they want to use every time they write those product descriptions. So they've got a custom application set up with us where, when they want to write product descriptions at scale, it sounds in the tone of voice that they want to use; that's been trained into the system. In addition to that, we've got inline suggestions that we can provide anybody that's writing with Writer. So your style guide can also be loaded into our system, such that when you are writing new content, you can get suggestions of, hey, we don't use title case in this situation, we use sentence case; or, we don't refer to that product with this name anymore, we renamed it this other thing. So there's training the model on sounding like your business, and then there's also enabling the writers, while they're writing, to make sure everything is on brand as well.
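As a toy illustration of the inline-suggestion idea Johnston describes, a style guide can be approximated as a set of pattern-to-replacement rules checked against a draft. The rules and names below are hypothetical examples, not Writer's actual implementation:

```python
import re

# Hypothetical style-guide rules: pattern -> suggested replacement.
STYLE_RULES = {
    r"\bGood Morning Team\b": "Good morning team",   # title case -> sentence case
    r"\bOldProductName\b": "NewProductName",         # renamed product
}

def suggest_edits(text):
    """Return (found, suggestion) pairs for each rule violation in the draft."""
    suggestions = []
    for pattern, replacement in STYLE_RULES.items():
        for match in re.finditer(pattern, text):
            suggestions.append((match.group(0), replacement))
    return suggestions

draft = "Good Morning Team, the OldProductName launch is next week."
for found, fix in suggest_edits(draft):
    print(f"Replace '{found}' with '{fix}'")
```

A production system would rank suggestions with a trained model rather than fixed regexes, but the shape of the feedback, "we don't use this term anymore, use that one," is the same.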
Joab Jackson 9:11
Nice, nice. So we'd like to take a look behind the covers, as much as you're comfortable with. We'd love to know: where does AI come into the making of your service? What does it add?
Ryan Johnston 9:23
Yeah, so AI is our large umbrella here: we've got machine learning, we've got natural language processing, all happening. And that comes through in some of the examples. Take, let's say, this podcast recording. When you put it into Writer, with functionality we've called takeaways, you want to get the key takeaways from that recording; you want to get the summary, you want to get key quotes from you and I speaking. The AI that we're using will then process that video and produce an output of text based on the conversation that's happening here. And so AI is really what's being used to take the inputs, text, video, audio, we've got multimedia happening at this point, and produce a text output, based on the training that's happened to it and the post-processing that's happened, to give you a positive output for your business.
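Writer's takeaways feature presumably uses large generative models, but the underlying idea of distilling a transcript into key sentences can be sketched with a simple extractive summarizer that scores sentences by word frequency. This is a toy stand-in for illustration only:

```python
import re
from collections import Counter

def key_takeaways(transcript, n=2):
    """Score each sentence by the average frequency of its words and return the top n."""
    sentences = [s.strip() for s in re.split(r"[.!?]", transcript) if s.strip()]
    words = re.findall(r"[a-z']+", transcript.lower())
    freq = Counter(words)

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    return sorted(sentences, key=score, reverse=True)[:n]

transcript = ("Writer trains models on brand content. "
              "Brand content keeps output on brand. "
              "I had toast for breakfast.")
print(key_takeaways(transcript, n=2))  # the off-topic breakfast sentence scores lowest
```

Sentences full of the transcript's most repeated words bubble to the top, which is roughly what "key takeaways" means in the extractive setting; generative models instead write a new summary in fresh words.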
Joab Jackson 10:12
Terrific. So you create a model for every customer?
Ryan Johnston 10:15
We are using our own proprietary model at this point in time, which is best fine-tuned for business use cases. And then for our enterprise customers, we can train the data that they give us into their model to give them a different output. So we've definitely got different motions and different pricing points within the business. If you come in at, you know, a kind of PLG (product-led growth) motion, using our trial experience yourself as our customer, you're likely to get the out-of-the-box functionality, and you can still use your style guide, you can still get the inline writing suggestions. It's just when it comes to training your data and getting your own kind of language model on the back end, that's going to be more for the enterprise customers.
Joab Jackson 10:52
I'd like to circle back on that in a bit. But so this is, I would call it, a cloud service, I guess?
Ryan Johnston 10:58
A cloud service, yes. All SaaS.
Joab Jackson 11:02
So your system has to do all this in real time, or fairly real time? Like, I submit a request and I get a response back a short time later?
Ryan Johnston 11:11
Yeah, in seconds. If you were to create a new blog post right now, based on a prompt of, write me a blog post about the five reasons my kid should go to school today, just as an example, you would get an immediate response within the same screen, within just a couple of seconds, as that prompt is processed.
Joab Jackson 11:25
Can you say anything about the MLOps system that you're using? Did you have to put together a framework to do this? And you have multiple customers, so it's a multi-tenant environment? Can you talk a little bit about how you set this up?
Ryan Johnston 11:40
Yeah, we have a variety of MLOps technologies that we're using. At the end of the day, we are using these to manage, monitor and optimize our models, so that's a pretty complex system. Example technologies that we're using are TensorFlow, Keras, PyTorch, etc. So we've got some pretty common, but innovative, technology there. And then we've also got in-house technology that we had to build. We really needed a model monitoring dashboard, where we can keep track of how the model is performing, and we've got an automated deployment system to keep the model up to date. So between that external technology and the in-house development that we've done, we've been able to produce a bunch of what I've been able to talk about on this podcast today.
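To make the model-monitoring idea concrete, here is a minimal sketch of a rolling health check. The window size, latency metric and 2-second SLO are hypothetical; a real dashboard like the one Johnston describes would track many more signals (output quality, drift, error rates):

```python
import statistics

class ModelMonitor:
    """Toy rolling monitor: keeps the last `window` latencies and checks them against an SLO."""
    def __init__(self, window=100, latency_slo_ms=2000):
        self.window = window
        self.latency_slo_ms = latency_slo_ms
        self.latencies = []

    def record(self, latency_ms):
        self.latencies.append(latency_ms)
        self.latencies = self.latencies[-self.window:]  # keep only the recent window

    def healthy(self):
        """Healthy while the median recent latency stays under the SLO."""
        if not self.latencies:
            return True
        return statistics.median(self.latencies) <= self.latency_slo_ms

monitor = ModelMonitor(latency_slo_ms=2000)
for ms in (800, 950, 1200):
    monitor.record(ms)
print(monitor.healthy())  # True: median latency is well under the 2s SLO
```

The same pattern, record a metric, window it, compare against a threshold, generalizes to whatever signal the dashboard needs to alert on.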
Joab Jackson 12:17
Terrific. You said the company started out with transformer technology. Could you explain what that does, generally speaking?
Ryan Johnston 12:25
Yes, so transformers are essentially what is powering ChatGPT as well; that's based on a large language model. And with a large language model, you have transformers that are essentially looking at a sequence of data. So in this sense, the sequence being words, or characters within a word, and it can essentially predict what should be the next output, or the next piece of content, i.e. the next word coming out of this. And so transformers are really the revolutionary technology that allows us to look at a data set not just as one sentence compared to another sentence, but as the relationships between certain words that are happening within a large set of data.
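The next-word prediction Johnston describes can be illustrated, in vastly simplified form, with a bigram model: count which word follows which, then predict the most frequent follower. Transformers learn far richer relationships across the whole sequence, but the prediction task is the same:

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count which word follows which: a tiny stand-in for what language models learn."""
    model = defaultdict(Counter)
    words = corpus.lower().split()
    for current, nxt in zip(words, words[1:]):
        model[current][nxt] += 1
    return model

def predict_next(model, word):
    """Return the most frequent next word, or None if the word was never seen."""
    followers = model.get(word.lower())
    return followers.most_common(1)[0][0] if followers else None

corpus = "the model writes the content and the model edits the content"
model = train_bigrams(corpus)
print(predict_next(model, "and"))  # 'the': the only word ever seen after 'and'
```

Where this toy looks at exactly one previous word, a transformer's attention mechanism weighs every word in the context, which is what lets it capture the relational structure Johnston mentions.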
Joab Jackson 13:01
Excellent, excellent. And so your transformers are customized to your customers. Do they also have other sources of data, like, I guess, English usage guides, that sort of thing?
Ryan Johnston 13:12
We have a variety of data that we bring in, honestly. There are publicly available datasets; a lot of those are in the other transformer models that you've probably heard about and been exposed to. There are private datasets. We've got some synthetic data that we've created as well, essentially creating data that mimics real-world data. And then we also have structured data, and image data that's coming in, so we're starting to get into the multimedia piece there. So it's honestly a lot of different data sources, but they've all been selected to work best for our model, to get the right output for the business use cases that we're trying to solve.
Joab Jackson 13:42
Very nice, very nice. So I'd like to pivot the conversation a bit, to not so much ChatGPT itself, but the implications it is showing all the businesses out there that are, you know, trying to leave a mark out in the world. You guys have taken on a specific need within business, but do you see this sort of approach working for others? What other things could this sort of transformer-based technology be used for?
Ryan Johnston 14:15
Yeah, I mean, we're seeing really interesting and exciting use cases pop up every single day at this point. So obviously, there's been news recently about Microsoft bringing this technology into Bing, and, compared to the experience that people are so used to with, honestly, Google search, Bing is going to create a whole different experience. And a lot of people are enjoying the information retrieval that they can get, the research that they can do, through this more chat-based experience and generative AI approach. So that's one area. Another area that we're going to continue to see changes in is the use cases I mentioned around the multimedia piece earlier. So right now, with ChatGPT, we're talking text in, text out. There's now video and audio in, and text out. There can also be video and audio in, and video and audio out, or more text as well. So there's going to be this whole new world, I think, of thinking about how different inputs create different outputs. But the user applications are what can be the most interesting, and I think the Microsoft Bing piece is the one that's kind of shocked the world the most.
Joab Jackson 15:14
It is, very definitely, very definitely. Now, these days, companies want to have a presence on the web and out in the world, so they do a lot of work with SEO, search engine optimization. This is where we try to figure out what Google is looking for and modify our articles, the format of our articles, so they are friendly to the search engine. But this ChatGPT sounds like it's coming in, looking at our collection of data and assembling it in completely different ways. So for any company with data on a website, how should they be preparing their data for these large language models? Is this something where we need to put pointers in? Or how should we think about these search engines coming in and making models of our data?
Ryan Johnston 16:06
Joab, I think it's a great question. And it's actually interesting, because I'll take a quick deviation with you here. I think there's a general ethical question that's coming up right now, which is: do companies even want their data within these models? There is a concept of opt in versus opt out, where most people feel like their content, their artwork, everything, is just being automatically opted in. Do they have the opportunity to opt out of having things in these models? And so I think that's the first thing for companies, before they even try to think about this within the SEO framework, which is more along the lines of: do you even want this to be happening? I think that's the benefit of large language models that are almost single-tenant, in the sense that they're unique to a business, because then there is a full understanding of where your data is going and whether or not you want that data in the model. To answer your question a little more specifically, though: a lot of these models are looking at the data that's available to them to scrape from the internet, essentially crawling all the data. So a lot of them are not looking at the data that's in the codebase behind the scenes; a lot of times, it's the content that's written on a page. Processing images and processing video can be a bit more difficult, so it really does come down to the written word that's on the page, at least for the way things have worked up until this point. So if there was some intention to best support your data ending up in a model, it would probably be: have more written content on the page, and make sure that, almost like from an accessibility standpoint, you've got everything clearly available to the web crawler, as much as to the person that's just trying to digest the page.
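The crawler-visibility point can be demonstrated with a few lines of Python: a simple text-only crawler sees the literal text in the HTML, while anything a script would render is invisible to it. This is a simplified sketch (modern crawlers vary in how much JavaScript they execute):

```python
from html.parser import HTMLParser

class VisibleText(HTMLParser):
    """Collect the text a simple crawler would see; script and style bodies are skipped."""
    def __init__(self):
        super().__init__()
        self.skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

page = "<html><body><h1>Our Product</h1><script>render('everything else');</script></body></html>"
parser = VisibleText()
parser.feed(page)
print(parser.chunks)  # ['Our Product'] -- the script-rendered text never reaches this crawler
```

Which is Johnston's point in miniature: the written words in the markup are what a text-only scraper can carry into a model, so put the content you care about there.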
Joab Jackson 17:36
Excellent. Doing this will ensure, I guess, that your voice is counted among the voices of the many that Bing or Google will eventually incorporate into their own models.
Ryan Johnston 17:48
You could say that. Well, yeah, you could say that.
Joab Jackson 17:50
Excellent. Also, you make an important distinction between internal and external models. Should companies create their own large language models?
Ryan Johnston 17:59
It's no easy feat; it's not easy to think about creating your own model. I mean, kudos to anybody that's trying to do it. I think what I'll say about that topic is, there's not just the model itself and the data that you create; there are a lot of layers on top of that as well. And it takes a lot of expertise to figure out how to take this raw set of data and, you know, the language model, and turn that into something that works really well for specific use cases. You've also got to think about claim detection: how much of the content that's being produced is actually accurate content? Is it safe to publish? Is it the type of content that you would want to be using? And then it also has to be tailored for the specific use cases. If you're really trying to write a business blog post, you probably want that done with a different model than one that works for, you know, writing a Shakespearean play about a day in the park. And so for that reason, I think there's a lot of fine-tuning that happens. I would probably recommend a business in this situation be thinking about what APIs are currently available from companies that are securely handling data, that they could integrate into their product. Or if you're not trying to integrate into a product, think about your own use case; go try and find almost that single-tenant model, your own large language model, that supports just your data and isn't shared across a bunch of other companies that can all pull in from that same data source. So that's probably your best bet, I think, for making progress in the current situation that we're in.
Joab Jackson 19:22
I wonder if you could talk about the enterprise version of Writer, in which, I understand you just said, you are at that point creating a model on behalf of the customer.
Ryan Johnston 19:32
Yeah, so what we can do is, we've got essentially this base model that we will use that is proprietary to Writer, and then if the customer wants to train the model on their data as well, they can give us access to as much of their data as they're willing to, and we can then train a model based on that. And we're going to do that uniquely for that customer. If UiPath wants to train the model on their content, Hilton is not also getting their model trained on UiPath's content. So that's how we're keeping things separate. And we're also not really storing any of that data; we're just training a model on it.
Joab Jackson 20:03
Interesting, interesting. And so what benefits would the customer get by having this model from you, or that you guys are holding for them?
Ryan Johnston 20:11
Yeah, I mean, the benefit is going to be consistently on-brand content at scale. And so when your writers start to use this, and the first time they see a blog post that actually sounds like the way they write, or the first time that they see a press release that is actually quoting things accurately, in the way that they want to quote them, this lightbulb moment kind of happens, where they realize: oh, this is helping me improve my processes. This is helping speed up certain parts of what I do. It's not taking away my creativity, but it is doing a great job of figuring out how to create content for me at a faster pace, content that actually sounds like what I want it to sound like. And so that is the benefit that you get out of a situation like this.
Joab Jackson 20:51
Terrific. I don't know if this is a particular service of Writer, but is it possible to get more analysis of the company from whatever information they provide you? Can they get more insights back from that model that maybe they haven't discovered before? Like, oh, we use too many semicolons, or something like that, I guess?
Ryan Johnston 21:09
Yeah, it's been very interesting. So I'll take the inline suggestions piece that we were talking about earlier. When, let's say, a customer is using Writer and they're writing lots and lots of content, Writer is constantly providing new suggestions to say, hey, you need to change this term to this term, or use this same snippet of content over and over again. The admins on Writer can see: this is where we keep having bottlenecks in our content; this is where people are constantly writing this instead of that. So that type of feedback has been invaluable for our customers to understand, essentially, where they need to do internal enablement and training, because people don't quite understand the brand voice.
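The admin-level insight Johnston describes is, at its core, aggregation: count the flagged issues across a team's drafts and rank them. A minimal sketch, with an entirely hypothetical violation log:

```python
from collections import Counter

# Hypothetical log of style-guide violations flagged across a content team.
flagged = [
    "title case in heading", "old product name", "old product name",
    "passive voice", "old product name", "title case in heading",
]

def top_bottlenecks(flags, n=2):
    """Rank the most frequently flagged issues so admins know where to focus training."""
    return Counter(flags).most_common(n)

print(top_bottlenecks(flagged))
# [('old product name', 3), ('title case in heading', 2)]
```

The top of that ranking is exactly the "people keep writing this instead of that" signal that tells an admin where internal enablement is needed.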
Joab Jackson 21:52
Fantastic. Those are the questions I had. Ryan, do you think there are any other aspects that might be worth mentioning, either about Writer itself or about this whole growing field?
Ryan Johnston 22:02
Joab, I think we covered a lot of great content today. I mean, I think for everybody it's an exciting time; there are a lot of opportunities to just go play with everything that's out there. And, you know, I think the most important thing is people are starting to make that transition from, okay, this was a cool, fun, shiny object, to, this is real business value for me. And I think that's the transformation we're seeing this year, from the launch of ChatGPT in December to where we're at now. We've seen, you know, a year happen in a month, from the pace that things are going. And so really, it just comes down to people finding practical, real applications from this, and that shift has kind of already been made, from what we're seeing.
Joab Jackson 22:38
All right, fantastic. And listeners, if you want more information about Writer, the URL is very easy: writer.com. You can find more info there. Ryan, thanks so much for taking the time to talk, and listeners, thank you for tuning in. We'll be back soon with another episode of The New Stack podcast.
Alex Williams 22:58
Thanks for listening. If you liked the show, please rate and review us on Apple Podcasts, Spotify, or wherever you get your podcasts. That's one of the best ways you can help us grow this community, and we really appreciate your feedback. You can find the full video version of this episode on YouTube; search for The New Stack, and don't forget to subscribe so you never miss any new videos. Thanks for joining us, and see you soon.
Transcribed by https://otter.ai