Last November, VentureBeat published an investigation into Alexa Answers, Amazon's service that lets any customer submit responses to unanswered questions. When it launched in general availability, Amazon gave assurances that submissions would be policed through a combination of automated and manual review. But our analysis revealed that untrue, misleading, and offensive questions and answers are served to millions of Alexa users.

In January, Amazon rolled out Alexa Answers as part of an invitation-only program in Germany, allowing some customers there to contribute answers. Around that same time, it also began serving the U.S. responses to customers in other English-speaking countries where Alexa is available. But problematic user-submitted content continues to make its way onto Alexa Answers, and by extension onto Alexa devices and to the customers who use them.

Questionable questions

The questions in Alexa Answers come from Alexa customers who ask questions to which the assistant doesn't have an answer. (According to Amazon, every day Alexa fields millions of questions it has never been asked before, a "substantial majority" of which it answers with vetted sources.) Once a question has been asked a certain number of times (Amazon declined to say how many), it makes its way onto the Alexa Answers portal, where it's fair game for anyone with an Amazon account.

Alexa Answers expands to more countries, but still isn’t properly vetted

New questions are those accepted for their novelty, while Hot questions are those without an answer that have been asked by "a lot" of Alexa customers. (One example is questions having to do with breaking news.) As for Popular questions, Amazon considers them "interesting" because they have multiple potential answers, leading to "lots" of answering activity on Alexa Answers. (For more detail about how questions in Alexa Answers are transcribed, how they're categorized, and the points-based system used to rank each contributed answer, refer to our earlier report.)


Answers sourced from Alexa Answers are appended with an "According to an Amazon customer" disclaimer, regardless of which Alexa-enabled device answers the question. Perhaps unsurprisingly, controversial questions appear with some frequency on the platform, like:

The answer to the first question is relatively harmless ("You would assassinate the president of the United States the same way you would any other President"), but the fact that the question hasn't been removed points to gaps in the moderation process.


The second question is factually untrue. George W. Bush was not killed and is in fact still alive, but the responses to the question, both of which refer to George H.W. Bush (who died in November 2018 of vascular Parkinsonism), are misleading. Any Alexa device user who posed the question would be led to believe that both George W. Bush and George H.W. Bush were dead.


The third is an uncouth joke, but it still falls into the category of questionable questions. It's telling that other assistants, including Google Assistant, refuse to respond to the prompt. An Alexa Answers user offered: "Hitler tied his shoes with little Nazis."


Questionable answers

Amazon says that questions submitted to Alexa Answers can be rejected by a combination of automated and manual filters if they fall into certain categories. Indeed, searches for questions on certain controversial topics, like "Holocaust," "Trump," and "suicide," return no results. (Again, see our earlier report for more information.)

Controversial, offensive, and factually incorrect answers

A subcategory of questions aims to suss out the "best" athletes, drinks, electronics, and more. They're problematic not only because they're subjective in nature, but because many of the answers lack supporting evidence:

One answer to the question about the world's best sandwich appears to have been intended for another question. It launches into a history of Isabella's Islay scotch whisky, which clearly has nothing to do with sandwiches (unless some of the ingredients are soaked in whiskey, of course).


As for the questions about the world's best golfer, coffee, basketball player, snowboarder, and so on, the responders cite no sources to back up their assertions. One posited that "Ron Seyka" is the best wrestler, but information about Seyka is hard to come by. In fact, it's unclear whether this person was a professional wrestler in any capacity.

Here are more questions that are problematic from the start, some with equally problematic answers.

One of the answers to the Bloomberg question ("Michael Bloomberg is a candidate for President with the Democratic Party. However there is much suspicion as to his motives and many people believe he is an operative of the Republicans") injects political opinion into what begins as a factual answer. Bloomberg was a registered Republican from 2001 to 2007, when he became an independent; he registered as a Democrat in 2018.


Where the question about Google Home is concerned, it's a safe assumption that "rating" is meant to refer to a customer or reviewer rating (like the one-to-five-star reviews that appear in product listings on Amazon, eBay, and Best Buy). The answer, "Poor," is a reflection of the worst of the Alexa Answers community. What might have been intended as a prank or joke could influence a person's purchasing decision.


The third question, "Who was the first black man on the moon?", contains a false premise. No African American astronaut has walked on the moon to date, although one user offers "Bernard Anthony Harris Jr." as a potential answer. This is incorrect: while Harris became the first African American to perform a spacewalk in February 1995, he never set foot on the moon. Interestingly, Alexa rarely surfaces this answer when asked the question, but instead erroneously responds with "Neil Armstrong."


A flawed process

As we noted the last go-around, Alexa Answers suffers from the shortcomings of the question-and-answer platforms that came before it, perhaps most famously Yahoo Answers, WikiAnswers, and StackExchange. (Amazon itself has run into problems with fake reviews on its retail website.) As things stand, it's incumbent upon Alexa Answers users to answer questions thoroughly and in good faith, and to self-police beyond the behind-the-scenes automated filtering.

There's no silver bullet, but Amazon would do well to require citations so users can see where information is coming from. It could also step up its moderation efforts by improving the automated systems currently in place or by expanding the size of its human reviewer workforce. And it could consider migrating toward a model akin to Reddit's, where subject-matter experts conduct moderation by topical area.