Google: Follow Our Structured Data Requirements to Ensure Rich Result Eligibility

Google’s John Mueller recommends following the company’s official structured data requirements to ensure content is eligible to be displayed as a rich result.

This topic was discussed in the latest installment of the #AskGoogleWebmasters video series, in which the following question was addressed: “[Do] we need to use structured data as per the Google Developers site (including required/recommended properties) or can we use more properties from Schema.org apart from the Developers site?”

In response, Mueller says it’s perfectly fine to use structured data properties that aren’t listed on the Google Developers site. The structured data documented there is what Google officially supports as rich results; there are numerous other types available for webmasters to use.

With that said, if the goal is to have a web page be displayed as a rich result in Google, then following the company’s official requirements is highly recommended.

It’s important to keep in mind, though, that utilizing structured data does not guarantee that a web page will be displayed as a rich result; it simply makes the page eligible to be displayed as one.

Using structured data types outside of what Google officially supports is optional, but also acceptable. Even if the structured data is not supported in the form of rich results, it still helps Google understand the content better and rank it accordingly.

Here’s how Mueller explains it: “Independently, you’re always welcome to use structured data to provide better machine readable context for your pages. Which may not always result in visible changes, but can still help our systems to show your pages for relevant queries.”
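To illustrate the point, here is a minimal sketch of what such markup might look like; it is a hypothetical example, not taken from Google’s documentation, and the specific values are invented. It combines properties Google documents for the Article rich result with additional Schema.org properties that Google does not list but that can still provide machine-readable context.

```python
import json

# Hypothetical Article markup. "headline", "image", "datePublished" and "author"
# are among the properties Google documents for Article rich results;
# "wordCount" and "genre" are valid Schema.org properties that Google does not
# list, included here to give machines extra context about the page.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "image": ["https://example.com/photo.jpg"],
    "datePublished": "2019-08-01T08:00:00+00:00",
    "author": {"@type": "Person", "name": "Jane Doe"},
    # Properties beyond Google's documented set
    "wordCount": 850,
    "genre": "technology",
}

# Serialise to JSON-LD, ready to embed in a <script type="application/ld+json"> tag.
print(json.dumps(article_markup, indent=2))
```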

Google is Testing Search Results Without URLs

Google appears to be testing the complete removal of URLs from search results, displaying only the website name instead.

This was spotted by a Reddit user who shared the following screenshot in a thread:

[Screenshot: the search result as spotted on Reddit, with no URL shown]

For comparison, here’s how part of that search result appears when URLs are shown (I’ve circled the difference to make it painfully obvious).

[Screenshot: the same search result with the URL shown, difference circled]

Google has slowly been moving away from showing full URLs since the introduction of breadcrumbs a few months ago. Now it seems Google is testing the impact of removing URLs altogether, to the point of not even showing the domain name.

As one Reddit user states in the thread, perhaps the greatest concern about this change is being able to verify the legitimacy of the website being shown in search results:

“In this era of search results that don’t even show the domain name, how’s Google going to keep phishing sites from using the names of the businesses they’re trying to impersonate? Worse yet, might Google have to roll this back after discovering phishing sites were able to exploit this lack of domains in the search results to get people to divulge passwords, credit card numbers, and all other sorts of sensitive information?”

Looking at it another way, removing URLs from search results likely won’t change much in terms of SEO and may even alter the perceived value of exact-match URLs.

Here are another Reddit user’s thoughts on the matter:

“I don’t think this is going to change much in terms of SEO, it just removes one factor that separated good content from good ranking. This is going to normalize the perception of users between traditional TLDs and modern TLDs because they can’t subconsciously decide whether or not to click on something based on the perceived trustworthiness of the URL.

I think that fundamentally it is a good thing for new domains trying to get into a niche. This is going to devalue the perceived “vanity” of any particular URL, which is good in a climate where basically any URL is taken.”

We may be looking at the beginning of the end for URLs in search results.

 

Google Search: More Than Half of Searches Now Result in No Clicks to Outside Websites

More than half of Google’s search engine results now lead to no clicks through to websites or properties that aren’t owned by Google.

[Chart: share of Google searches resulting in no clicks]

This data was obtained from clickstream analysis and was first analysed by SparkToro.

Why are fewer people clicking from the SERPs?

This milestone in Google organic search has been caused by a number of factors:

  • An increase in search engine results page (SERP) features that pull information from external websites directly into Google’s search results, e.g. featured snippets and People Also Ask boxes.
  • Google-related properties and features appearing for an increasing number of verticals (Google Flights, Google Maps, YouTube, etc.).
  • Ongoing changes to the SERP design, directing more traffic towards paid advertisements.

How does this impact my website?

Fewer searches resulting in clicks could mean fewer website visits. This could lead to a decrease in organic traffic, despite an increase in organic visibility or ranking position.

Measuring a site’s performance will also be impacted, as traffic and conversions have traditionally relied on users moving from the SERPs to a site.

The extent to which this impacts traffic and conversions is highly dependent on vertical. However, as Google continues to expand its reach, there are lessons that can be learnt for SEO strategy moving forward.

What can we do about this?

With fewer clicks available than before, the organic landscape is more competitive than ever. But organic search still represents a significant opportunity. There are several things that can be done to make sure a website not only survives, but thrives in this new search environment:

  1. Optimise for Google’s SERP features

While some search results may lead to no clicks, being present in the SERP can have other benefits, like brand visibility and keeping your website at the forefront of a searcher’s mind. Crucially, optimising a site for Google features also means that competitors are not taking advantage of the same opportunities.

When investigating which SERP features to target, it’s important to prioritise. A site may choose to focus only on keywords that have a high click-through rate (CTR), or it may prioritise visibility over clicks, optimising for features even if that reduces CTR from the SERP. Be aware of the landscape and make informed decisions.
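As a rough illustration of that prioritisation, the sketch below (with made-up keyword data and CTR figures) estimates the clicks each keyword could realistically deliver by weighting search volume against the organic CTR its SERP layout tends to allow, then ranks the list so trade-offs can be made deliberately.

```python
# Hypothetical keyword data: monthly search volume and the organic CTR
# observed (or estimated) for the SERP layout each keyword triggers.
keywords = [
    {"keyword": "buy running shoes", "volume": 12000, "organic_ctr": 0.28},
    {"keyword": "what is a marathon", "volume": 30000, "organic_ctr": 0.07},  # largely answered in the SERP
    {"keyword": "running shoe reviews", "volume": 8000, "organic_ctr": 0.22},
]

# Estimated clicks = volume x CTR; rank keywords by the traffic they can actually deliver.
for kw in keywords:
    kw["estimated_clicks"] = kw["volume"] * kw["organic_ctr"]

for kw in sorted(keywords, key=lambda k: k["estimated_clicks"], reverse=True):
    print(f'{kw["keyword"]}: ~{kw["estimated_clicks"]:.0f} estimated clicks/month')
```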

  2. Optimise content for Google-owned properties

Making sure a website is in a position to capitalise on traffic from Google-owned properties will help a brand stay visible, even if organic clicks are reduced. For example, a travel company may want to ensure they’re included in Google Flights. And websites with a physical location will want to be optimised for Google’s Local Pack.

  3. Optimise for search journeys

Searches rarely occur in silos, so even if one search results in no clicks, a future search may drive a visit. Ensuring that a website is set up to answer a user’s need at every stage of the search journey will give it the best chance of driving qualified leads.

As the organic search environment gets increasingly competitive and dominated by Google itself, it’s not enough to just offer a product.

  4. Continue to build content that places users at the centre of a website experience

Organic search is often a significant part of any marketing campaign; however, SEO is not a silver bullet. Building a product and an experience that customers love is crucial to keep people coming back.

It’s worth noting that Google is currently being investigated by the US Department of Justice over antitrust claims relating to its search results. While we would never recommend relying on external factors like this, those discussions are ongoing.

Should we target other search engines if Google is delivering fewer clicks?

Many SEO campaigns focus attention primarily on optimising for Google.

Despite Google offering fewer clicks from its search results than ever before, it’s still the largest search engine by a significant margin. This means that small gains in Google are equivalent to large increases in smaller search engines.

[Chart: search engine market share]

Google is at the forefront of search engine technology, with other search engines often playing catch-up. Optimising for Google will future-proof a website.

 

This article was written by Dan Cartland from The Drum and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.

Adobe, Microsoft and LinkedIn Join Forces to Accelerate Account-Based Experiences

Adobe (Nasdaq:ADBE) today announced an extension of its partnership with Microsoft and a new integration with LinkedIn that will accelerate account-based experiences (ABX) through new marketing solution integrations. Adobe and Microsoft are aligning key data sources to populate account-based profiles in Adobe Experience Cloud, including Marketo Engage and Microsoft Dynamics 365 for Sales. This will empower B2B marketers and sellers to easily identify, understand and engage B2B customer buying teams. This partnership will drive better orchestration, measurement and delivery of targeted content for a more personalized experience at both the individual and account level on key B2B platforms like LinkedIn.

“Orchestrating the engagement of multiple individuals in a complex marketing and sales journey is at the heart of account-based experiences and what B2B marketers do day in and out,” said Steve Lucas, senior vice president, Digital Experience business, Adobe. “With these new account-based capabilities, marketing and sales teams will have increased alignment around the people and accounts they are engaging, and new ways to measure that business impact.”

“The ability to leverage the power of data to find the right opportunities and use insights helps marketing and sales to plan their next move with a member of the buying committee,” said Alysa Taylor, corporate vice president of Business Applications and Global Industry at Microsoft. “Together with Adobe and LinkedIn, Microsoft can help to deliver an end-to-end solution that ultimately accelerates lead conversion and can create opportunities for improved servicing and better cross sell, resulting in higher lifetime value of the account.”

“One of the biggest challenges for marketers running campaigns is ensuring that their messages are reaching the right audiences and delivering ROI at scale,” said Jen Weedn, vice president of Business Development at LinkedIn. “By extending LinkedIn account-based marketing capabilities to Adobe Experience Cloud users, we’ve created a seamless way for them to identify and target the right audiences on LinkedIn with meaningful content, ultimately helping improve the success of their campaigns.”

The partnership further deepens the longstanding collaboration between Adobe and Microsoft. The addition of an integration with LinkedIn Marketing Solutions offers new ways for marketers to more effectively engage with accounts and buying teams, enabling them to:

  • Gain a deeper, real-time understanding of targeted accounts: Marketing and sales teams will be able to leverage data from Marketo Engage and Microsoft Dynamics 365 for Sales to get a deeper, real-time understanding of targeted accounts, including insights into individual roles, influence and preferences. By integrating LinkedIn’s Matched Audiences with Marketo Engage, the combined account-based targeting capabilities will help marketers identify the right contacts within an account to reach on LinkedIn.
  • Target audiences more effectively by leveraging richer account profiles: To identify the best-fit accounts to pursue, Marketo Engage’s Account Profiling capability combines the power of intelligence-driven predictive modeling and automation in a single ABX solution, empowering marketers to identify these accounts out of 25 million companies within minutes.
  • Power people-based campaigns with more precision than ever before: Adobe Audience Manager, the company’s Data Management Platform (DMP), will enable brands to stitch together audience data to power contact-based campaigns on LinkedIn and other channels, while informing media buys with more precision than ever before. Data governance and privacy controls help to ensure that customer data is kept secure and help brands to comply with their privacy policies and data privacy compliance objectives.
About Adobe

Adobe is changing the world through digital experiences. For more information, visit www.adobe.com.

LinkedIn Announces New Data Partnership with Adobe to Improve Ad Targeting

LinkedIn continues to expand its audience targeting capability, this time through a new partnership with Adobe which will expand LinkedIn’s account-based marketing capabilities to Adobe Experience Cloud users.

As explained by Adobe:

“Adobe and Microsoft are aligning key data sources to populate account-based profiles in Adobe Experience Cloud, including Marketo Engage and Microsoft Dynamics 365 for Sales. This will empower B2B marketers and sellers to easily identify, understand and engage B2B customer buying teams.”

The deal will essentially enable marketing and sales teams to utilize data from LinkedIn, Marketo Engage and Microsoft Dynamics 365 to gain greater insight into the audiences they need to reach, and target them more effectively. So say, for example, an Adobe customer is selling office supplies – they’ll now be able to utilize LinkedIn’s audience data and ad targeting tools to show ads to the specific decision-makers, based on job roles, locations, etc., to more effectively market their offerings.

The main target for this new partnership appears to be Salesforce, which provides similar targeting and reach capacity, though without the full, in-depth professional and career dataset of LinkedIn. As you may recall, back in 2016, when Microsoft’s pending takeover of LinkedIn was first announced, Salesforce voiced its opposition to the merger, noting that:

“Microsoft’s proposed acquisition of LinkedIn threatens the future of innovation and competition. By gaining ownership of LinkedIn’s unique dataset of over 450 million professionals in more than 200 countries, Microsoft will be able to deny competitors access to that data, and in doing so obtain an unfair competitive advantage.”

This new deal, partnering Adobe’s back-end tools with LinkedIn’s insights, will definitely broaden their combined offering – and with LinkedIn now serving more than 610 million members, and seeing ‘record levels’ of engagement, that data resource is growing every day.

As noted, this is the latest in LinkedIn’s efforts to make better use of its professional dataset, and expand its advertising potential. The platform also recently launched lookalike audiences and a new integration with search data from Microsoft Bing, further building on its targeting capacity.

Each of these moves has significant implications for advertisers, making LinkedIn a more powerful tool for reaching the right people.

Adobe Expands Integrations With AI-Driven Account Profiling, LinkedIn Syncing

Adobe has launched new capabilities within the Marketo Engage ABM Essentials offering, part of the Adobe Marketing Cloud, including new AI-powered models in its Account Profiling tool and an expanded LinkedIn partnership. The two updates aim to help accelerate the execution of ABM strategies for sales and marketing teams, as well as drive personalized and account-based experiences.

The Marketo Engage Account Profile feature, powered by AI, now includes predictive analytics, designed to suggest potential net-new accounts and help users better uncover new target accounts from more than 500 million data points within the platform.

Adobe Marketing Cloud has also expanded on its LinkedIn Matched Audiences integration, which aims to position users to sync lists of accounts into LinkedIn through Marketo Engage to create personalized ad targeting. The LinkedIn extension can also filter for different target account characteristics.

“Identifying the right target accounts and key decision makers shouldn’t deter marketers from launching or implementing an ABM strategy,” said Brian Glover, Director of Product Marketing of Marketo Engage at Adobe, in a statement to Demand Gen Report. “It shouldn’t be a guessing game or require weeks of manual work either. With the latest enhancements to Marketo Engage’s Account Profiling capability and our expanded LinkedIn integration, B2B marketers can identify their best-fit accounts in minutes out of a universe of 25 million companies and then target key decision makers at those accounts on LinkedIn.”

NLG in the newsroom: fast, consistent, and hyperlocal

We’ve written about how Natural Language Generation can eliminate the bottleneck of manual, one-at-a-time analysis within business environments, producing data-driven insights that otherwise would remain fossilized in spreadsheets on the network drive. It is also worth noting that the same principles apply to the world of news reporting, where there is so much valuable data to consider that editors—when assigning stories based on limited human analytical capacity—are forced to leave many data sets completely unexamined. Editorial prioritization tends to mirror demand, leading to the omission of hyperlocal content that is highly useful, but to only a small subset of readers. A Natural Language Generation platform—particularly when it is open, extensible, smart, and secure (like ours!)—solves this problem.

This is not a theoretical statement, but one that is actively illustrated today by UK-based RADAR AI, and by BBC News Labs. Both organizations are using Arria’s NLG platform to publish high-quality stories that otherwise would simply never be written.

RADAR AI Hits a Milestone: 200,000 Stories

RADAR (an acronym for Reporters and Data and Robots) is a joint venture between data-journalism start-up Urbs Media and the Press Association, the UK national news agency. Just last week, after being open for business for only fourteen months, RADAR AI published its 200,000th story: “Crown court waiting times increase by more than seven months in Newcastle,” by Harriet Clugston, Data Reporter. (Nice job title. Notice how effectively it stakes out Clugston’s data-driven approach while also establishing her beat as essentially boundary-free. With a title like that, a journalist can use any data set as a starting point for investigation and explanation.)

Relatively few in the world are concerned about Crown Court waiting times in Newcastle, a city of approximately 300,000 as of 2018. In fact, we can guess that relatively few of the 300,000 citizens of Newcastle are concerned about the Crown Court waiting times. Generally speaking, our interest in this subject is proportional to the degree of our civic or personal interest in the municipality or the court. We’re mildly interested if we’re paying taxes to keep the system running, but probably not highly interested unless ours is one of the 455 cases waiting to be heard. For understandable, practical reasons, an editor probably would not have assigned a reporter to analyze how caseloads and waiting times have changed over the years, and then to write a story describing movements in the data. A subject of broader interest would win the day and the Crown Court story—which does in fact contain valuable insights for those who are interested in the topic—would never have appeared.

Fortunately the RADAR approach “breaks the ‘content compromise’ which forces organisations to choose between high quality, reliable and bespoke content or mass-produced superficial output.” Makes perfect sense. Does an editor really want Harriet Clugston to spend her time crunching and describing numbers? No, Arria’s NLG platform can do most of that for her, freeing her to perform a level of reporting that makes the hyperlocal piece read like a story of broader interest. In the brief, information-packed article, Clugston includes direct quotations from four individuals who are involved in the UK criminal justice system:

  • Stephanie Boyce, Deputy Vice President of the Law Society of England and Wales;
  • John Apter, Chair of the Police Federation of England and Wales;
  • Sara Glenn, Deputy Chief Constable for Hampshire Constabulary; and
  • A spokeswoman for HM Courts and Tribunals Service.

With NLG in an assisting role, Clugston has taken the opportunity to maximize the value of her story to the few citizens of Newcastle who are fretting about longer wait times at Crown Court.

Let’s do the math on those 200,000 stories that RADAR has published since opening for business in June of 2018. That is a rate of approximately 13,300 stories per month, or 430 per day. Not bad for an organization that has only seven employees on LinkedIn!

For comparison, by its own reckoning The New York Times—which employs approximately 1,300 staff writers—publishes roughly 200 “pieces of journalism” per day, with “pieces of journalism” likely including blog posts and interactive graphics, in addition to stories. Following is an excerpt from a 2017 internal report created by “the 2020 group” of Times editors tasked with spending the prior year examining editorial policies and practices at the paper:

“The Times publishes about 200 pieces of journalism every day. This number typically includes some of the best work published anywhere. It also includes too many stories that lack significant impact or audience—that do not help make The Times a valuable destination.”

A couple of paragraphs later, the report states the problem even more plainly: “We devote a large amount of resources to stories that relatively few people read…. It wastes time—of reporters, backfielders, copy editors, photo editors, and others—and dilutes our report.”

It would appear that RADAR has found a way to address these concerns. A story about Crown Court waiting times probably lacks a significant audience, but does have a significant impact on a small audience. Especially if its appearance or delivery can be targeted to readers most likely to be interested, a hyperlocal story such as this one represents a step in the right direction rather than the dilution of “some of the best work published anywhere.”

The Newcastle story is available to us only as a screenshot, but here is another recent story from Harriet Clugston to which all of the observations above are applicable: NHS staff took almost 7,000 full-time days of sick leave because of drug or alcohol abuse last year, figures reveal.

BBC News Labs and SALCO Part 1

The BBC, too, facing heightened expectations for the frequency and quality of local news content, has commenced a ‘Semi-automated Local Content’ initiative, SALCO for short. BBC News Labs developers Roo Hutton and Tamsin Green began by developing a pipeline that reported Accident and Emergency statistics for more than one hundred local hospitals—interesting information but, again, not the best use of top-notch journalists’ time. As Hutton explains in his excellent article from March of this year, “Stories by numbers: How BBC News is experimenting with semi-automated journalism,” “Automated journalism isn’t about replacing journalists or making them obsolete. It’s about giving them the power to tell a greater range of stories— whether they are directly publishing the stories we generate, or using them as the starting point to tell their own stories— while saving them the time otherwise needed to analyse the underlying data.”

Hutton describes a respectful, cooperative approach during which he and Green work closely with journalists in order to learn how they think, and also to help the journalists understand how Arria’s platform works, and why traditional writers should embrace NLG.

“This story has been generated using BBC election data and some automation”

The experiment was such a success that BBC News Labs decided to take the same approach to covering local elections in May of this year. Writing on the News Labs site a couple of weeks ago in “Salco part 2: Automating our local elections coverage,” Tamsin Green explains the rationale, both in terms of workload volume and the need for consistency in coverage: “Local elections on BBC News Online are covered at both a national level, to aggregate results and highlight trends, as well as locally by journalists working out of regional hubs. With 248 councils up for election in England alone, that means a huge amount of journalism in a short period of time. We observed huge variation in election coverage across the country: Some councils were not covered. Some would simply take tweets from the @bbcelection Twitter feed. Others were there at the count, posting photographs and detailed results.”

This is a textbook case for NLG, and the sample output from BBC News Labs looks good. As you contemplate it, consider the variance in style and content that would naturally arise if reporters were left to configure the information themselves from municipality to municipality—and consider how long it would take to assemble even a hodgepodge of inconsistent reporting.
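To see why automation helps with both volume and consistency, here is a deliberately simplified data-to-text sketch. It is not Arria’s platform and the figures are invented; it just shows how a single template renders the same structure and tone for every council.

```python
# Invented results for two councils; in practice this data would come from an elections feed.
results = [
    {"council": "Exampleshire", "party": "Party A", "seats_won": 24, "seats_total": 40, "change": 3},
    {"council": "Sampleton", "party": "Party B", "seats_won": 18, "seats_total": 35, "change": -2},
]

def election_story(r):
    # One template, applied identically to every council, guarantees consistent coverage.
    direction = "gaining" if r["change"] >= 0 else "losing"
    return (
        f'{r["party"]} has taken {r["seats_won"]} of the {r["seats_total"]} seats on '
        f'{r["council"]} council, {direction} {abs(r["change"])} compared with the last election.'
    )

for r in results:
    print(election_story(r))
```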

Twilio: Harnessing The Power Of AI (Artificial Intelligence)

Twilio, a provider of voice, video and messaging services, reported its second-quarter results, and they were certainly a standout. Revenues spiked by 86% to $275 million and there was a profit of 2 cents a share.

On the earnings call, CEO Jeff Lawson noted: “We have the opportunity to change communications and customer engagement for decades to come.”

And yes, as should be no surprise, one of the drivers will be AI (Artificial Intelligence).  Just look at the company’s Autopilot offering (at today’s Signal conference, Twilio announced that the product is generally available to customers).  This is a system that allows for the development, training and deployment of intelligent bots, IVRs and Alexa apps.

Now it’s true that there is plenty of hype with AI. Let’s face it, many companies are just using it as marketing spiel to gin up interest and excitement.

Yet Autopilot is the real deal. “The advantage that’s unique to Twilio’s API platform model is that we build these tools in response to seeing hot spots of demand and real need from our customers,” said Nico Acosta, who is the Director of Product & Engineering for Twilio’s Autopilot & Machine Learning Platform. “We have over 160 thousand customers of every size across a huge breadth of industries and we talk to them about the solutions they need to improve communication with their customers. What do they keep building over and over? What do they actively not want to build because it’s too heavy a lift? Those conversations inform the products we create that ultimately help them differentiate themselves through a better customer experience.”

AI Innovation

Consider that Autopilot breaks the conventional wisdom that there is an inherent tradeoff between operational efficiency and customer experience.  To do this, Twilio has been focusing on pushing innovation with AI, such as with:

  • Classification: This involves grouping utterances and mapping them to the correct task. With AI, the system gets smarter and smarter.
  • Entity Extraction: This uses NLP (Natural Language Processing) to locate details like time, place, cities, phone numbers and so on. This means it is easier to automate repetitive tasks like setting up appointments (if the customer says “7 at night,” the NLP will understand this). A toy sketch of both ideas follows this list.
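Neither idea depends on Twilio-specific machinery. The sketch below uses keyword rules and a regular expression rather than Autopilot’s actual models, purely to make the two capabilities concrete: map an utterance to a task, then pull out the entities the task needs.

```python
import re

# Toy intent "classifier": keyword rules standing in for a trained model.
INTENT_KEYWORDS = {
    "book_appointment": ["appointment", "book", "schedule"],
    "check_balance": ["balance", "how much"],
}

def classify(utterance: str) -> str:
    text = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(k in text for k in keywords):
            return intent
    return "fallback"

# Toy entity extraction: pull a time expression such as "7 at night" or "3 pm".
TIME_PATTERN = re.compile(r"\b(\d{1,2})\s*(am|pm|at night|in the morning)\b", re.I)

def extract_time(utterance: str):
    match = TIME_PATTERN.search(utterance)
    if not match:
        return None
    hour, qualifier = int(match.group(1)), match.group(2).lower()
    if qualifier in ("pm", "at night") and hour < 12:
        hour += 12  # normalise "7 at night" to 19:00
    return f"{hour:02d}:00"

utterance = "Can you book an appointment for 7 at night?"
print(classify(utterance), extract_time(utterance))  # book_appointment 19:00
```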

There are definitely some interesting use cases for Autopilot. One is with Green Dot, which is a next-generation online bank. A big challenge for the company is that its customers are often new to financial services. But with Autopilot, Green Dot has been able to develop a conversational agent that works on a 24/7 basis, on both the IVR and chatbot system in the mobile app. The company also gets a view of important metrics like usage, time spent and common questions about products.

Here are some other interesting use cases:

  • Insurance: Generate quotes automatically, file claims easily, and answer FAQs.
  • Hospitality: Offer virtual concierge services, answer FAQs, and manage reservations programmatically.
  • Real Estate: Field and generate leads, schedule appointments programmatically, and answer questions about listings.
  • Retail & Ecommerce: Allow customers to search products, take advantage of promotional offers, and check delivery status.

Keep in mind that changing a traditional IVR system can be complicated and time-consuming, let alone adding AI capabilities to it. But with Autopilot, the development is lightning fast. You can create bots with simple JSON syntax and deploy them on multiple channels with zero code changes. There are also easy-to-use tools for training the AI models.
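By way of illustration, the sketch below builds one such JSON payload in Python: a task that speaks a prompt and then listens for the caller’s reply. The wording and task are invented, and the exact action names and schema should be checked against Twilio’s Autopilot documentation rather than taken from this sketch.

```python
import json

# A minimal actions payload a task might return: say a prompt, then keep
# listening for the caller's reply. The copy is invented; verify the action
# names against Twilio's Autopilot docs before relying on them.
appointment_task = {
    "actions": [
        {"say": "Sure, I can help you book an appointment. What day works for you?"},
        {"listen": True},
    ]
}

# The same JSON definition can back a voice IVR, an SMS bot, or an Alexa app
# without code changes; only the channel configuration differs.
print(json.dumps(appointment_task, indent=2))
```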

Takeaways With Autopilot

With the development of Autopilot, there were some important learnings for AI. “No single model can handle the many different use cases,” said Acosta. “Because of this, we created a multi-model architecture that adjusts in real time. For example, if there are large amounts of data, a deep learning algorithm might be best. But if not, a more traditional model could be better.”

Regardless of the technical details, Autopilot does point to the incredible power of AI and how it is poised to upend the software market.

“The potential of AI to transform communications is huge, but there’s a big delta between that potential and how companies are actually using it at scale,” said Acosta.  “So at Twilio, we are focused on the building blocks that customers need to put the innovation to work.”

Tom (@ttaulli) is the author of the book, Artificial Intelligence Basics: A Non-Technical Introduction.

Artificial Intelligence Search, NLP & Automation

Another significant requirement is the need to find an efficient method for reducing the amount of computational searching for a match or a solution. Considerable important work has been done on the problem of pruning a search space without affecting the result of the search. One technique is to compare the value of completing a particular branch versus another; of course, measuring that value is itself a problem. As real-time applications become more important, search methods must become even more efficient in order for an AI system to run in real time.
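As an illustration of that branch-versus-branch comparison, here is a toy branch-and-bound sketch (a generic example, not tied to any system mentioned above): a branch is abandoned as soon as an optimistic estimate of its best possible outcome is no better than the best complete solution found so far, so the pruning cannot change the final answer.

```python
# Toy branch-and-bound: pick one cost from each row so the total is minimal,
# pruning any branch whose optimistic bound cannot beat the best total found so far.
costs = [
    [4, 2, 8],
    [3, 7, 5],
    [6, 1, 9],
]

best = {"total": float("inf")}

def lower_bound(row, partial):
    # Optimistic estimate: cost so far plus the cheapest entry of every remaining row.
    return partial + sum(min(r) for r in costs[row:])

def search(row=0, partial=0):
    if lower_bound(row, partial) >= best["total"]:
        return  # prune: this branch cannot improve on the incumbent solution
    if row == len(costs):
        best["total"] = partial
        return
    for cost in costs[row]:
        search(row + 1, partial + cost)

search()
print(best["total"])  # 2 + 3 + 1 = 6
```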

Natural Language Processing

There has been an increasing amount of work on the problem of language understanding. Early work was focused on direct pattern matching in speech recognition and parsing sentences in processing written language. More recently, there has been more use of knowledge about the structure of language and speech to reduce the computational requirements and improve the accuracy of the results. There are several systems that can recognize as many as several thousand words, enough for a fairly extensive command set in a “hands busy” application, but not enough for business text entry (dictation to text).

A number of production natural language command systems are capable of understanding structured English commands. These systems are context-sensitive and require that a situation-specific grammar and vocabulary be created for each application.
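The need for a situation-specific grammar and vocabulary can be shown with a toy command interpreter. The sketch below is a generic illustration rather than any of the production systems referred to above; its grammar and vocabulary are invented for a single hypothetical reporting application and would have to be rebuilt for any other domain.

```python
import re

# Application-specific vocabulary: only these verbs and objects are understood.
VERBS = {"show", "list", "delete"}
OBJECTS = {"report", "invoices", "orders"}

# Situation-specific grammar: VERB OBJECT ["for" PERIOD]
COMMAND = re.compile(r"^(?P<verb>\w+)\s+(?P<object>\w+)(?:\s+for\s+(?P<period>\w+))?$", re.I)

def parse(command: str):
    match = COMMAND.match(command.strip())
    if not match:
        return None
    verb, obj = match.group("verb").lower(), match.group("object").lower()
    if verb not in VERBS or obj not in OBJECTS:
        return None  # outside this application's vocabulary
    return {"verb": verb, "object": obj, "period": match.group("period")}

print(parse("show report for March"))  # {'verb': 'show', 'object': 'report', 'period': 'March'}
print(parse("translate report"))       # None: 'translate' is not in this grammar
```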

Automatic Programming

An obvious application for AI technology is in the development of software without some of the more tedious aspects of coding. There has been some research on various aspects of software program development. Arthur D. Little, Inc. has developed a structured English to LISP compilation system for a client, and an equivalent commercial system has recently been announced.

It should soon be possible to build a Programmer’s Assistant that will assist in the more routine aspects of code development, although no development has apparently been completed beyond a system that assists in the training of Ada programmers and a prototype system that converts a logical diagram to LISP (Reasoning Systems). True automatic programming that will relieve a programmer of the responsibility for logical design seems to be some time in the future.

But ‘automation’ in AI, from self-learning systems to systems designed to assist with or automate certain tasks, has come under fire for its potential to be flawed. It is very difficult for the flawed or biased human mind to create and automate a system, process or task without including some of our inherent flaws and biases. Thus, it has recently been argued that AI solutions being developed for automation may include flawed logic.

This has been demonstrated, to a degree, by some of Microsoft’s attempts to automate chatbot learning by exposing a bot to content on the internet, or at least to content within Twitter. Such a system very quickly picks up biased views and can become very politically incorrect in a short period of time.

Amazon’s Alexa as a presentation layer over a Tableau dashboard integrated with Arria’s NLG technology

A Big Day for Arria at VOICE Summit 2019
Greg Williams

I write this dispatch while sitting with Arria teammates outside of the crowded meeting space where Arria’s COO, Jay DeWalt, and Chief Scientist, Ehud Reiter, are unveiling the breakthrough use of Amazon’s Alexa as a presentation layer over a Tableau dashboard integrated with Arria’s NLG technology. We’re happy to listen through the open door, giving up our seats to VOICE Summit attendees, many of whom we met at the Arria booth this morning. It’s great to see interest in Natural Language Generation at such high levels. NLG occupies a unique and absolutely essential layer of the technology stack that will make possible dynamic, multi-turn conversations with machines.

Jay and Ehud are joined by Kapila Ponnamperuma, Arria’s Head of Technology Integrations, and we know that Kapila is going to demonstrate the technology live in a few minutes. Since we’ve seen the demo, we know what the audience is in for. BI dashboards plus NLG are already impressive enough. Just wait until the attendees witness Kapila asking questions related to sales performance, and Alexa responding intelligently, remembering context to support follow-on questions. . . .

In the presentation leading up to the demo, Jay and Ehud make the point that data comprehension is more difficult when looking at raw data than at visuals, and more difficult when looking at visuals alone than at visuals combined with narrative written in natural human language. Hence the rapid pace of Arria’s BI dashboard integrations.

By combining an Arria-integrated BI dashboard with Alexa, or other conversational platforms, Arria takes it one step further: facilitating dynamic conversational AI for business.
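To make the architecture concrete, here is a rough sketch of the kind of glue an intent handler might contain. The function names, data and payloads are hypothetical stand-ins rather than Arria’s or Tableau’s actual APIs, and a real Alexa skill would use the Alexa Skills Kit request/response format; the point is simply that the handler queries a BI data source, keeps context for follow-on questions, and returns a spoken narrative.

```python
# Hypothetical intent handler: the voice platform passes the recognised intent
# and slots; we query a BI data source, remember context for multi-turn
# questions, and return narrative text for the assistant to speak.
session_context = {}  # remembers the last measure/region for follow-on questions

def fetch_dashboard_data(measure, region):
    # Stand-in for a call to the dashboard's data API.
    fake_data = {("sales", "EMEA"): {"value": 1_200_000, "change_pct": 8.5}}
    return fake_data.get((measure, region), {"value": 0, "change_pct": 0.0})

def generate_narrative(measure, region, data):
    # Stand-in for a call to an NLG service; here, a plain template.
    return (f"{measure.title()} in {region} reached {data['value']:,}, "
            f"a change of {data['change_pct']:+.1f}% on the previous quarter.")

def handle_intent(intent, slots):
    # Fall back to remembered context so "What about EMEA?" works as a follow-on.
    measure = slots.get("measure") or session_context.get("measure", "sales")
    region = slots.get("region") or session_context.get("region", "EMEA")
    session_context.update({"measure": measure, "region": region})
    data = fetch_dashboard_data(measure, region)
    return generate_narrative(measure, region, data)

print(handle_intent("QuerySalesIntent", {"measure": "sales", "region": "EMEA"}))
print(handle_intent("FollowUpIntent", {}))  # reuses the remembered context
```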

Arria at the VOICE Summit

Arria stands out at VOICE Summit as one of the few exhibitors primarily interested in business applications rather than consumer applications. (We also happen to be wearing fluorescent neon orange golf shirts, selected by SVP of Strategic Partnerships and Business Development, Lyndsee Manna, so we’re easy to spot.) Arria is dedicated to using the power of language to help businesses achieve greater efficiency, discover deeper insights in their data, and ultimately make better, smarter decisions than their competitors.

Update—A Few Minutes Later

It was an extraordinary demo, extraordinarily well-received. Kapila quizzed Alexa about sales performance across multiple measures and dimensions. Conversationally, without a mouse, he achieved the equivalent of drilling down into a BI dashboard, and the audience heard Arria respond immediately with actionable information.

In the discussion that followed, Kapila made the point that if you have an existing BI Dashboard, you can be up and running in hours, with a sophisticated multi-turn conversation application that remembers context to facilitate follow-on questions.

The audience and the Arria presenters were so engaged with one another that an administrator had to call time in order to clear the room for the next session. Jay offered to continue the session in the hallway. Clusters of attendees kept him and Ehud busy fielding questions for another half an hour.

Tomorrow we’ll check in after Cathy Herbert delivers her VED Talk in the afternoon. (“TED Talk,” but with a V for VOICE.) Cathy will provide guidance on how companies can position themselves to take advantage of the forthcoming avalanche of improvements in Natural Language Generation, particularly when the NLG platform offers built-in computational and linguistic functions and is paired with a conversation platform such as Alexa.

Until then, signing off.