Facebook buys ads in Indian newspapers to warn about WhatsApp fakes

As Twitter finally gets serious about purging fake accounts, and YouTube says it will try to firefight conspiracy theories and fake news flaming across its platform with $25M to fund bona fide journalism, Facebook-owned WhatsApp is grappling with its own fake demons in India, where social media platforms have been used to seed and spread false rumors — fueling mob violence and leading to a number of deaths in recent years.

This week Facebook has taken out full-page, WhatsApp-branded adverts in Indian newspapers to try to stem the tide of life-threatening digital fakes spreading across social media platforms in the region with such tragic results.

It’s not the first time the company has run newspaper ads warning about fake news in India, though it does appear to be the first time it’s responded to the violence being sparked by fakes spreading on WhatsApp specifically.

The full-page WhatsApp anti-fakes advert also informs users that “starting this week” the platform is rolling out a new feature that will allow users to determine whether a message has been forwarded. “Double check the facts when you’re not sure who wrote the original message,” it warns.


This follows tests WhatsApp was running back in January when the platform trialed displaying notifications for when a message had been forwarded many times.

Evidently WhatsApp has decided to take that feature forward, at least in India, although how effective a check it will be on technology-accelerated fakes that are likely also fueled by local prejudices remains to be seen.

Trying to teach nuanced critical thinking is difficult when a more basic lack of education may be what foments mistrust, drives credulity, and enables malicious fakes and rumors targeting certain people or segments of the population in the first place. The tips risk being ineffectual, and coming across as merely irresponsible fiddling around the edges of a grave problem that has already claimed multiple lives.

Facebook also stands accused of failing to respond quickly enough to similar risks in Myanmar — where the UN recently warned that its platform was being weaponized to spread hate speech and used as a tool to fuel ethnic violence.

Reuters reports that the first batch of WhatsApp’s anti-fake-news ads are running in “key Indian newspapers”, and images posted to Twitter show an English-language full-page advert — so you do have to question who these first ads are really intended to influence.

But the news agency reports that Facebook also intends to publish similar ads in regional dailies across India over the course of this week.

We’ve reached out to WhatsApp with questions and will update this story with any response.

“We are starting an education campaign in India on how to spot fake news and rumours,” a WhatsApp spokesman told Reuters in a statement. “Our first step is placing newspaper advertisements in English and Hindi and several other languages. We will build on these efforts.”

The quasi-educational WhatsApp fake news advert warns users about “false information”, offering ten tips to spot fakes — many of which boil down to ‘check other sources’ to try to verify whether what you’ve been sent is true.

Another tip urges WhatsApp users to “question information that upsets you” and, if they do read something that makes them “angry or afraid”, to “think twice before sharing it again”.

“If you are not sure of the source or concerned that the information may be untrue, think twice before sharing,” reads another tip.

The last tip notes that “fake news often goes viral”, warning: “Just because a message is shared many times, does not make it true.”

In recent times, Facebook has also run full-page ads in newspapers to apologize for failing to safeguard user data in the wake of the Cambridge Analytica scandal, and taken out print adverts ahead of European elections to warn against attempts to spread fake news to try to meddle with democratic processes.


Source: https://techcrunch.com/2018/07/10/facebook-buys-ads-in-indian-newspapers-to-warn-about-whatsapp-fakes/


The Great British Hack-Off summer festival hackathon will take aim at Brexit

While the chaos of Brexit plays out at the top of UK politics this week, it’s very hard to know what the effects of Brexit are “on the ground”. Local news no longer has much of a business model to concentrate on specific subjects or campaigns. Social media is a mess of local Facebook groups that only locals can see. MPs often ignore email and online campaigns from constituents.

The Great British Hack-Off aims to address this. In a two-day, overnight “hackathon” on the weekend of July 21-22, it will try to build a groundswell of interest in improving local communities and economies, and to connect people with their decision makers and MPs about their concerns for jobs, services and the economy.

The event is also going to be used to pressure MPs to back a new, ‘people’s vote’ on the terms of Brexit, with the option to remain in the EU.

It will be held by Tech For UK (Twitter, hashtag #GBhackoff, Instagram, Facebook), the tech industry body calling for a meaningful people’s vote on Brexit, with the option to Remain, and by anti-Brexit group Best For Britain.

Anyone interested can apply to attend the event via this form.

The Great British Hack-Off will ask a number of questions and try to build products to address the answers.

Are local community projects, some formerly funded by the EU, still going? Are they being replaced? What about local factories and businesses? What about health services? Are local or central governments stepping in to help, or are people’s concerns being ignored? Is European and other foreign investment ebbing away from local communities, or is it being replaced? Are local news sources sharing what is going on?

What are the human stories? How can social media and video be used to tell those stories best?

The Great British Hack-Off will be a festival of tech and creativity to address these issues.

Tech For UK says the ultimate goal will be to engage the tech community to help Best for Britain connect people in local communities to the information they need on Brexit and, in turn, connect them to their decision makers and MPs. Attendees to the Hackathon will also be able to work on their own projects and ideas related to Brexit.

Tech For UK says this will be the first event in a series, to be continued at other cities around the UK, not just in London.

Structured like a hackathon, it will be held at a central London venue, bringing together engineers, designers, storytellers, marketers, data scientists, artists, journalists, PR and media people, analytics experts and social media influencers to work on these problems.

Participants will be selected from applications and given full instructions about the event.

There will be capacity for 120 people and the opportunity to stay overnight at the hackathon. Food and beverages will be provided. This will be an ‘18s and over’ event. There will also be wheelchair access.


Source: https://techcrunch.com/2018/07/09/the-great-british-hack-off-summer-festival-hackathon-will-aim-at-brexit/

Facebook was never ephemeral, and now its Stories won’t have to be

Before Snapchat made social media about just today, Facebook made it about forever. The 2011 “Timeline” redesign of the profile and keyword search unlocked your past, encouraging you to curate colorful posts about your life’s top moments. That was actually an inspiration for Snapchat, as its CEO Evan Spiegel wrote in its IPO announcement that “We learned that creativity can be suppressed by the fear of permanence.”

Now Facebook is finding a middle ground by optionally unlocking the history of your Stories that otherwise disappear after 24 hours. Facebook will soon begin testing Stories Highlights, the company confirmed to TechCrunch. Similar to Instagram Stories Highlights, it will let you pick your favorite expired photos and videos, compile them into themed collections with titles and cover images and display them on your profile.

The change further differentiates Facebook Stories from the Snapchat Stories feature it copied. It’s smart for Facebook, because highly compelling content was disintegrating each day, dragging potential ad views to the grave with it. And for its 150 million daily users, it could make the time we spend obsessing over social media Stories a wiser investment. If you’re going to interrupt special moments to capture them with your phone, the best ones should still pay dividends of self-expression and community connection beyond a day later.

Facebook Stories Highlights was first spotted by frequent TechCrunch tipster Jane Manchun Wong, who specializes in generating screenshots of unreleased features out of the APK files of Android apps. TechCrunch inquired about the feature, and a Facebook spokesperson provided this statement: “People have told us they want a way to highlight and save the Stories that matter most to them. We’ll soon start testing highlights on Facebook – a way to choose Stories to stay on your profile, making it easier to express who you are through memories.”

These Highlights will appear on a horizontal scroll bar on your profile, and you’ll be able to see how many people viewed them just like with your Stories. They’ll default to being viewable by all your friends, but you can also restrict Highlights to certain people or make them public. The latter could be useful for public figures trying to build an audience, or anyone who thinks their identity is better revealed through the commentary on the world that Stories’ creative tools offer, as opposed to some canned selfies and profile pics.

Facebook paved the way for Highlights by launching the Stories Archive in May. This automatically backs up your Stories privately to your profile so you don’t have to keep the saved versions on your phone, wasting storage space. That Archive is the basis for being able to choose dead Stories to show off in your Highlights. Together, they’ll encourage users to shoot silly, off-the-cuff content without that “fear of permanence,” but instead with the opportunity. If you want to spend a half hour decorating a Facebook Story with stickers and drawing and captions and augmented reality, you know it won’t be in vain.

Facebook Stories constantly adds new features, like this Blur effect I spotted today

While many relentlessly criticize Facebook for stealing Stories from Snapchat, its rapid iteration and innovation on the format means the two companies’ versions are sharply diverging. Snapchat still lacks a Highlights-esque feature despite launching its Archive-style Memories back in July 2016. Instead of enhancing the core Stories product that made the app a teen phenomenon, it’s concentrated on Maps, gaming, Search, professional Discover content, and a disastrously needless redesign.

Facebook’s family of apps seized on the stagnation of Snapchat Stories and its neglect of the international market. It copied whatever was working while developing new features like Instagram’s Superzoom and Focus portrait mode, the ability to reshare public feed posts as quote tweet-style Stories and the addition of licensed music soundtracks. While writing this article, I even discovered a new Facebook Stories option called Blur that lets you shroud a moving subject with a dream-like haze, as demonstrated with my dumb face here.

The relentless drive to add new options and smooth out performance has paid off. Now Instagram has 400 million daily Stories users, WhatsApp has 450 million and Facebook has 150 million, while Snapchat’s whole app has just 191 million. As Instagram CEO Kevin Systrom admitted about Snapchat, “They deserve all the credit.” Still, it hasn’t had a megahit since Stories and AR puppy masks. The company’s zeal for inventing new ways to socialize is admirable, though not always a sound business strategy.

At first, the Stories war was a race to copy functionality and invade new markets. Instagram and now Facebook making ephemerality optional for their Stories signals a second phase of the war. The core idea of broadcasting content that disappears after a day has become commoditized and institutionalized. Now the winner will be declared not as who invented Stories, but who perfected them.


Source: https://techcrunch.com/2018/07/09/facebook-stories-highlights/

Snapchat code reveals team-up with Amazon for ‘Camera Search’

Codenamed “Eagle,” Snapchat is building a visual product search feature that delivers users to Amazon’s listings. Buried inside the code of Snapchat’s Android app is an unreleased “Visual Search” feature where you “Press and hold to identify an object, song, barcode, and more! This works by sending data to Amazon, Shazam, and other partners.” Once an object or barcode has been scanned you can “See all results at Amazon.”

Visual product search could make Snapchat’s camera a more general purpose tool for seeing and navigating the world, rather than just a social media maker. It could differentiate Snapchat from Instagram, whose clone of Snapchat Stories now has more than twice the users and a six times faster growth rate than the original. And if Snapchat has worked out an affiliate referrals deal with Amazon, it could open a new revenue stream. That’s something Snap Inc. direly needs after posting a $385 million loss last quarter and missing revenue estimates by $14 million.

TechCrunch was tipped off to the hidden Snapchat code by app researcher Ishan Agarwal. His tips have previously led to TechCrunch scoops about Instagram’s video calling, soundtracks, Focus portrait mode and QR Nametags features that were all later officially launched. Amazon didn’t respond to a press inquiry before publishing time, and it’s unclear whether it’s actively involved in the development of Snapchat visual search or just a destination for its results. Snap Inc. gave TechCrunch a “no comment,” but the company’s code tells the story.

Snapchat first dabbled in understanding the world around you with its Shazam integration back in 2016 that lets you tap and hold to identify a song playing nearby, check it out on Shazam, send it to a friend or follow the artist on Snapchat. Project Eagle builds on this audio search feature to offer visual search through a similar interface and set of partnerships. The ability to identify purchaseable objects or scan barcodes could turn Snapchat, which some view as a teen toy, into more of a utility.

What’s inside Snapchat’s Eagle eye

Snapchat’s code doesn’t explain exactly how the Project Eagle feature will work, but in the newest version of Snapchat it was renamed as “Camera Search.” The code lists the ability to surface “sellers” and “reviews,” “Copy URL” of a product and “Share” or “Send Product” to friends — likely via Snap messages or Snapchat Stories. In characteristic cool kid teenspeak, an error message for “product not found” reads “Bummer, we didn’t catch that!”

Eagle’s visual search may be connected to Snapchat’s “context cards,” which debuted late last year and pull up business contact info, restaurant reservations, movie tickets, Ubers or Lyfts and more. Surfacing within Snapchat a context card of details about ownable objects might be the first step to getting users to buy them… and advertisers to pay Snap to promote them. It’s easy to imagine context cards being accessible for products tagged in Snap Ads as well as scanned through visual search. And Snap already has in-app shopping.

Being able to recognize what you’re seeing makes Snapchat more fun, but it’s also a new way of navigating reality. In mid-2017 Snapchat launched World Lenses that map the surfaces of your surroundings so you can place 3D animated objects like its Dancing Hotdog mascot alongside real people in real places. Snapchat also released a machine vision-powered search feature last year that compiles Stories of user-submitted Snaps featuring your chosen keyword, like videos with “puppies” or “fireworks,” even if the captions don’t mention them.

Snapchat was so interested in visual search that this year, it reportedly held early-stage acquisition talks with machine vision startup Blippar. The talks fell through with the U.K. augmented reality company that has raised at least $99 million for its own visual search feature, but which recently began to implode due to low usage and financing trouble. Snap Inc. might have been hoping to jumpstart its Camera Search efforts.

Snap calls itself a camera company, after all. But with the weak sales of its mediocre v1 Spectacles, the well-reviewed v2 failing to break into the cultural zeitgeist and no other hardware products on the market, Snap may need to redefine what exactly that tag line means. Visual search could frame Snapchat as more of a sensor than just a camera. With its popular use for rapid-fire selfie messaging, it’s already the lens through which some teens see the world. Soon, Snap could be ready to train its eagle eye on purchases, not just faces.

In related Snapchat news:


Source: https://techcrunch.com/2018/07/09/snapchat-camera-search/



Twitter’s efforts to suspend fake accounts have doubled since last year

Bots, your days of tweeting politically divisive nonsense might be numbered. The Washington Post reported Friday that in the last few months Twitter has aggressively suspended accounts in an effort to stem the spread of disinformation running rampant on its platform.

The Washington Post reports that Twitter suspended as many as 70 million accounts between May and June of this year, with no signs of slowing down in July. According to data obtained by the Post, the platform suspended 13 million accounts during a weeklong spike of bot banning activity in mid-May.

Sources tell the Post that the uptick in suspensions is tied to the company’s efforts to comply with scrutiny from the Congressional investigation into Russian disinformation on social platforms. The report adds that Twitter investigates bots and other fake accounts through an internal project known as “Operation Megaphone” through which it buys suspicious accounts and then investigates their connections.

Twitter declined to provide additional information about the Washington Post report but pointed us to a blog post from last week in which it disclosed other numbers related to its bot hunting efforts. In May of 2018, Twitter identified more than 9.9 million suspicious accounts — triple its efforts in late 2017.

Chart via Twitter

When Twitter identifies an account that it deems suspicious, it then “challenges” that account, giving legitimate Twitter users an opportunity to prove their sentience by confirming a phone number. When an account fails this test it gets the boot, while accounts that pass are reinstated.
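
That pass/fail flow is simple enough to sketch. The following is a toy model only — not Twitter’s actual system or API; the `Account` fields and `resolve_challenge` helper are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    flagged_suspicious: bool = False  # did Twitter's detection deem it suspicious?
    phone_confirmed: bool = False     # did the user pass the phone-number challenge?

def resolve_challenge(account: Account) -> str:
    """Toy model of the flow described above: flagged accounts are
    'challenged'; confirming a phone number reinstates them, while
    failing the challenge means suspension."""
    if not account.flagged_suspicious:
        return "active"
    return "reinstated" if account.phone_confirmed else "suspended"

print(resolve_challenge(Account("bot123", flagged_suspicious=True)))
# prints: suspended
print(resolve_challenge(Account("human", flagged_suspicious=True, phone_confirmed=True)))
# prints: reinstated
```

The point of the design is that the challenge is cheap for a human (confirm a phone number) but expensive for a bot farm operating accounts at scale.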

As Twitter noted in its recent blog post, bots can make users look good by artificially inflating follower counts.

“As a result of these improvements, some people may notice their own account metrics change more regularly,” Twitter warned. The company noted that cracking down on fake accounts means that “malicious actors” won’t be able to promote their own content and accounts as easily by inflating their own numbers. Kicking users off a platform, fake or not, is a risk for a company that regularly reports its monthly active users, though only a temporary one.

As the report notes, at least one insider expects Twitter’s Q2 active user numbers to dip, reflecting its shift in enforcement. Still, any temporary user number setback would prove nominal for a platform that should focus on healthy user growth. Facebook is facing a similar reckoning as a result of the Russian bot scandal, as the company anticipates user engagement stats to dip as it moves to emphasize quality user experiences over juiced up quarterly numbers. In both cases, it’s a worthy tradeoff.


Source: https://techcrunch.com/2018/07/06/twitter-bots-numbers-disinformation-washington-post/

Desktop, Mobile, or Voice? (D) All of the Above – Whiteboard Friday

Posted by Dr-Pete

We’re facing more and more complexity in our everyday work, and the answers to our questions are about as clear as mud. Especially in the wake of the mobile-first index, we’re left wondering where to focus our optimization efforts. Is desktop the most important? Is mobile? What about the voice phenomenon sweeping the tech world?

As with most things, the most important factor is to consider your audience. People aren’t siloed to a single device — your optimization strategy shouldn’t be, either. In today’s Whiteboard Friday, Dr. Pete soothes our fears about a multi-platform world and highlights the necessity of optimizing for a journey rather than a touchpoint.


Desktop, Mobile, or Voice? All of the above.


Video Transcription

Hey, everybody. It’s Dr. Pete here from Moz. I am the Marketing Scientist here, and I flew in from Chicago just for you fine people to talk about something that I think is worrying us a little bit, especially with the rollout of the mobile index recently, and that is the question of: Should we be optimizing for desktop, for mobile, or for voice? I think the answer is (d) All of the above. I know that might sound a little scary, and you’re wondering how you do any of these. So I want to talk to you about some of what’s going on, some of our misconceptions around mobile and voice, and some of the ways that maybe this is a little easier than you think, at least to get started.

The mistakes we make

So, first of all, I think we make a couple of mistakes. When we’re talking about mobile for the last few years, we tend to go in and we look at our analytics and we do this. These are made up. The green numbers are made up or the blue ones. We say, “Okay, about 90% of my traffic is coming from desktop, about 10% is coming from mobile, and nothing is coming from voice. So I’m just going to keep focusing on desktop and not worry about these other two experiences, and I’ll be fine.” There are two problems with this:

Self-fulfilling prophecy

One is that these numbers are kind of a self-fulfilling prophecy. They might not be coming to your mobile site. You might not be getting those mobile visitors because your mobile experience is terrible. People come to it and it’s lousy, and they don’t come back. In the case of voice, we might just not be getting that data yet. We have very little data. So this isn’t telling us anything. All this may be telling us is that we’re doing a really bad job on mobile and people have given up. We’ve seen that with Moz in the past. We didn’t adapt to mobile as fast as maybe we should have. We saw that in the numbers, and we argued about it because we said, “You know what? This doesn’t really tell us what the opportunity is or what our customers or users want. It’s just telling us what we’re doing well or badly right now, and it becomes a self-fulfilling prophecy.”

Audiences

The other mistake I think we make is the idea that these are three separate audiences. There are people who come to our site on desktop, people who come to our site on mobile, people who come to our site on voice, and these are three distinct groups of people. I think that’s incredibly wrong, and that leads to some very bad ideas and some bad tactical decisions and some bad choices.

So I want to share a couple of stats. There was a study Google did called The Multiscreen World, and this was almost six years ago, 2012. They found six years ago that 65% of searchers started a search on their smartphones. Two-thirds of searchers started on smartphones six years ago. Sixty percent of those searches were continued on a desktop or laptop. Again, this has been six years, so we know the adoption rate of mobile has increased. So these are not people who only use desktop or who only use mobile. These are people on a journey of search that move between devices, and I think in the real world it looks more like this right now.
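
Putting those two figures together gives a rough sense of the overlap. A quick back-of-the-envelope sketch, assuming the 60% applies to the searches that started on smartphones, as the study’s framing suggests:

```python
# Figures quoted from Google's 2012 "Multiscreen World" study cited above.
smartphone_start = 0.65       # share of searchers who began a search on a smartphone
continued_on_desktop = 0.60   # of those searches, share continued on desktop/laptop

# Implied share of all searchers whose single search spanned two devices
cross_device = smartphone_start * continued_on_desktop
print(f"{cross_device:.0%} of searchers moved a search from phone to desktop")
# prints: 39% of searchers moved a search from phone to desktop
```

Even on 2012 numbers, roughly two in five searchers carried a single search across devices — which is exactly the journey described next.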

Another stat from the series was that 88% of people said that they used their smartphone and their TV at the same time. This isn’t shocking to you. You sit in front of the TV with your phone and you sit in front of the TV with your laptop. You might sit in front of the TV with a smartwatch. These devices are being used at the same time, and we’re doing more searches and we’re using more devices. So one of these things isn’t replacing the other.

The cross-device journey

So a journey could look something like this. You’re watching TV. You see an ad and you hear about something. You see a video you like. You go to your phone while you’re watching it, and you do a search on that to get more information. Then later on, you go to your laptop and you do a bit of research, and you want that bigger screen to see what’s going on. Then at the office the next day, you’re like, “Oh, I’ll pull up that bookmark. I wanted to check something on my desktop where I have more bandwidth or something.” You’re like, “Oh, maybe I better not buy that at work. I don’t want to get in trouble. So I’m going to go home and go back to my laptop and make that purchase.” So this purchase and this transaction, this is one visitor on this chain, and I think we do this a lot right now, and that’s only going to increase, where we operate between devices and this journey happens across devices.

So the challenge I would make to you is this: if you’re looking at this and you’re saying, “Only so many percent of our users are on mobile. Our mobile experience doesn’t matter that much. It’s not that important. We can just live with the desktop people. That’s enough. We’ll make enough money.” If they’re really on this journey and they’re not segmented like this, and you break that chain, what happens? You lose that person completely, and that was a person who also used desktop. So that person might be someone who you bucketed in your 90%, but they never really got to the device of choice and they never got to the transaction, because by having a lousy mobile experience, you’ve broken the chain. So I want you to be aware of that: this is a cross-device journey, not these segmented ideas.

Future touchpoints

This is going to get worse. This is going to get scarier for us. So look at the future. We’re going to be sitting in our car and we’re going to be listening — I still listen to CDs in the car, I know it’s kind of sad — but you’re going to be listening to satellite radio or your Wi-Fi or whatever you have coming in, and let’s say you hear a podcast or you hear an author and you go, “Oh, that person sounds interesting. I want to learn more about them.” You tell your smartwatch, “Save this search. Tell me something about this author. Give me their books.” Then you go home and you go on Google Home and you pull up that search, and it says, “Oh, you know what? I’ve got a video. I can’t play that because obviously I’m a voice search device, but I can send that to Chromecast on your TV.” So you send that to your TV, and you watch that. While you’re watching the TV, you’ve got your phone out and you’re saying, “Oh, I’d kind of like to buy that.” You go to Amazon and you make that transaction.

So it took this entire chain of devices. Again now, what about the voice part of this chain? That might not seem important to you right now, but if you break the chain there, this whole transaction is gone. So I think the danger is by neglecting pieces of this and not seeing that this is a journey that happens across devices, we’re potentially putting ourselves at much higher risk than we think.

On the plus side

I also want to look at sort of the positive side of this. All of these devices are touchpoints in the journey, and they give us credibility. We found something interesting at Moz a few years ago, which was that a sale of our SaaS product on average took about three touchpoints. People didn’t just hit the Moz homepage, do a free trial, and then buy it. They might see a Whiteboard Friday. They might read our Beginner’s Guide. They might go to the blog. They might participate in the community. If they hit us with three touchpoints, they were much more likely to convert.

So I think the great thing about this journey is that if you’re on all these touchpoints, even though to you that might seem like one search, it lends you credibility. You were there when they ran the search on that device. You were there when they tried to repeat that search on voice. The information was in that video. You’re there on that mobile search. You’re there on that desktop search. The more times they see you in that chain, the more that you seem like a credible source. So I think this can actually be good for us.

The SEO challenge

So I think the challenge is, “Well, I can’t go out and hire a voice team and a mobile team and do a design for all of these things. I don’t want to build a voice app. I don’t have the budget. I don’t have the buy-in.” That’s fine.

One thing I think is really great right now, and that we’re encouraging people to experiment with, is featured snippets. We’ve talked a lot about these answer boxes that give you an organic result. One of the things Google is trying to do with this is they realize that they need to use their same core engine, their same core competency across all devices. So the engine that powers search, they want that to run on a TV. They want that to run on a laptop, on a desktop, on a phone, on a watch, on Google Home. They don’t want to write algorithms for all of these things.

So Google thinks of their entire world in terms of cards. You may not see that on desktop, but everything on desktop is a card. This answer box is a card. That’s more obvious. It’s got that outline. Every organic result, every ad, every knowledge panel, every news story is a card. What that allows Google to do, and will allow them to do going forward, is to mix and match and put as many pieces of information as make sense for any given device. So for desktop, that might be a whole bunch. For mobile, that’s going to be a vertical column. It might be less. But for a watch or a Google Glass, or whatever comes after that, or voice, you’re probably only going to get one card.
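As a rough illustration of that card model, here is a minimal sketch (hypothetical data and card budgets, not Google’s actual system) of how a single ranked list of result cards might be truncated differently per device surface:

```python
# Hypothetical sketch of the "card" model: one ranked list of result
# cards, with each device surface showing only as many as fit on it.
cards = ["featured snippet", "ad", "organic result 1",
         "organic result 2", "knowledge panel", "news story"]

# Assumed per-device card budgets -- illustrative numbers only.
card_budget = {"desktop": 6, "mobile": 4, "watch": 1, "voice": 1}

def render(device):
    """Return the subset of cards that fits on the given device."""
    return cards[:card_budget[device]]

print(render("desktop"))  # every card fits on the big screen
print(render("voice"))    # only the top card -- "result number zero"
```

The point of the sketch is that nothing device-specific happens in ranking; the same ordered list powers every surface, and voice simply gets the single top card.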

But one great thing right now, from an SEO perspective, is these featured snippets, these questions and answers, they fit on that big screen. We call it result number zero on desktop because you’ve got that box, and you’ve got a bunch of stuff underneath it. But that box is very prominent. On mobile, that same question and answer pairing takes up a lot more screen space. There’s still a SERP, but that box is very dominant, and then there’s some stuff underneath. On voice, that same question and answer pairing is all you get, and we’re seeing that a lot of the answers on voice, unless they’re specialty like recipes or weather or things like that, have this question and answer format, and those are also being driven by featured snippets.

So the good news I think, and will hopefully stay good news going forward, is that because Google wants all these devices to run off that same core engine, the things you do to rank well for desktop and to be useful for desktop users are also going to help you rank on mobile. They’re going to help you rank on voice, and they’re going to help you rank across all these devices. So I want you to be aware of this. I want you to try and not to break that chain. But I think the things we’re already good at will actually help us going forward in the future, and I’d highly encourage you to experiment with featured snippets to see how questions and answers appear on mobile and to see how they appear on Google Home, and to know that there’s going to be an evolution where all of these devices benefit somewhat from the kind of optimization techniques that we’re already good at hopefully.

Encourage the journey chain

So I also want to say that when you optimize for answers, the best answers leave searchers wanting more. So what you want to do is actually encourage this chain, encourage people to do more research, give them rich content, give them the kinds of things that draw them back to your site, that build credibility, because this chain is actually good news for us in a way. This can help us make a purchase. If we’re credible on these devices, if we have a decent mobile experience, if we come up on voice, that’s going to help us really kind of build our brand and be a positive thing for us if we work on it.

So I’d like you to tell me, what are your fears right now? I think we’re a little scared of the mobile index. What are you worried about with voice? What are you worried about with IoT? Are you concerned that we’re going to have to rank on our refrigerators, and what does that mean? So it’s getting into science fiction territory, but I’d love to talk about it more. I will see you in the comment section.

Video transcription by Speechpad.com


Source: https://moz.com/blog/desktop-mobile-voice

Tinder Loops, the dating app’s new video feature, rolls out globally

Tinder Loops, the recently announced video feature from Tinder, is today rolling out globally.

Tinder has been testing this feature in Canada and Sweden since April, when it was first announced, and has rolled out to a few other markets since then.

Today, Loops are available to Tinder users across the following markets: Japan, United Kingdom, United States, France, Korea, Canada, Australia, Germany, Italy, Netherlands, Russia, Sweden, Belgium, Denmark, Iceland, Ireland, Kuwait, New Zealand, Norway, Qatar, Saudi Arabia, Singapore, Switzerland, Taiwan, Thailand and United Arab Emirates.

Loops are two-second, looping videos that can be posted to users’ profiles. Users can’t shoot Tinder Loops from within the app, but rather have to upload and edit existing videos from their camera roll or upload a Live Photo from an iOS device.

Tinder is also expanding the amount of images you can post to your profile to nine, in order to make room for Loops without displacing existing photos.

Given that Tinder has been testing the feature since early April, the company now has more data around how Tinder Loops have been working out for users. For example, users who added a Loop to their profile saw that their average conversation length went up by 20 percent. The feature seems to be particularly effective in Japan — Loops launched there in June — with users receiving an average of 10 percent more right swipes if they had a Loop in their profile.

In the age of Instagram and Tinder, people have used photos to represent themselves online. But, with all the editing tools out there, that also means that photos aren’t always the most accurate portrayal of personality or appearance. Videos on Tinder offer a new way to get to know someone for who they are.


Source: https://techcrunch.com/2018/07/05/tinder-loops-the-dating-apps-new-video-feature-rolls-out-globally/

Wikipedia goes dark in Spanish, Italian ahead of key EU vote on copyright

Wikipedia’s Italian and Spanish language versions have temporarily shut off access to their respective versions of the free online encyclopedia in Europe to protest against controversial components of a copyright reform package ahead of a key vote in the EU parliament tomorrow.

The protest follows a vote by the EU parliament’s legal affairs committee last month which backed the reforms — including the two most controversial elements: Article 13, which makes platforms directly liable for copyright infringements by their users — pushing them towards pre-filtering all content uploads, with all the associated potential chilling effects for free expression; and Article 11, which targets news aggregator business models by creating a neighboring right for snippets of journalistic content — aka ‘the link tax’, as critics dub it.

Visitors to Wikipedia in many parts of the EU (and further afield) are met with a banner which urges them to defend the open Internet against the controversial proposal by calling their MEP to voice their opposition to a measure critics describe as ‘censorship machines’, warning it will “weaken the values, culture and ecosystem on which Wikipedia is based”.

Clicking on a button to ‘call your MEP’ links through to the anti-Article 13 campaign website, saveyourinternet.eu, where users can search for the phone number of their MEP and/or send an email to protest against the measure. The initiative is backed by a large coalition of digital and civil rights groups — including the EFF, the Open Rights Group, and the Center for Democracy & Technology.

In a longer letter to visitors explaining its action, the Spanish Wikipedia community writes that: “If the proposal were approved in its current version, actions such as sharing a news item on social networks or accessing it through a search engine would become more complicated on the Internet; Wikipedia itself would be at risk.”

The Spanish language version of Wikipedia will remain dark throughout the EU parliament vote — which is due to take place at 10 o’clock (UTC) on July 5.

“We want to continue offering an open, free and collaborative work with verifiable content. We call on all members of the European Parliament to vote against the current text, to open it up for discussion and to consider the numerous proposals of the Wikimedia movement to protect access to knowledge; among them, the elimination of articles 11 and 13, the extension of the freedom of panorama to the whole EU and the preservation of the public domain,” it adds.

The Italian language version of Wikipedia went dark yesterday.

While the protest banners about the reform are appearing widely across Wikipedia, the decisions to block out encyclopedia content are less widespread — and are being taken by each local community of editors.

As you’d expect, Wikipedia founder Jimmy Wales has been a very vocal critic of Article 13 — including lashing out yesterday at whoever was in control of the European Commission’s Twitter feed when they tried to suggest that online encyclopedias will not be affected by the proposal: that they would not be “considered” to be giving access to “large amounts of unauthorised protected content”, and that most of their content would fall outside the scope of the law because it’s covered by Creative Commons licenses. (An interpretation of the proposed rules that anti-Article 13 campaigners dispute.)

And the commissioners drafting this portion of the directive do appear to have been mostly intending to regulate YouTube — which has been a target for record industry ire in recent years, over the relatively small royalties paid to artists vs streaming music services.

But critics argue this is a wrongheaded, sledgehammer-to-crack-a-nut approach to lawmaking — which will have the unintended consequence of damaging free expression and access to information online.

Wales shot back at the EC’s tweet — saying it’s “deeply inappropriate for the European Commission to be lobbying publicly and misleading the public in this way”.

A little later in the same Twitter thread, as more users had joined the argument, he added: “The Wikipedia community is not so narrow minded as to let the rest of the Internet suffer just because we are big enough that they try to throw us a bone. Justice matters.”

The EU parliament will vote as a whole tomorrow — when we’ll find out whether or not MEPs have been swayed by this latest #SaveYourInternet campaign.


Source: https://techcrunch.com/2018/07/04/wikipedia-goes-dark-in-spanish-italian-ahead-of-key-eu-vote-on-copyright/