Monthly Archives: August 2019
Download an easy-to-use PPC performance troubleshooting template that will organize your analysis and allow for easy collaboration among teams.
Read more at PPCHero.com
Telegram, a popular instant messaging app, has introduced a new feature to give group admins on the app better control over how members engage, the latest in a series of interesting features it has rolled out in recent months to expand its appeal.
The feature, dubbed Slow Mode, allows a group administrator to dictate how often a member can send a message in the group. If a group enables it, members who have sent a message will have to wait between 30 seconds and an hour before they can say something again in that group.
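Under the hood, a feature like this amounts to a per-member cooldown timer. Here is a minimal sketch in Python (purely illustrative, not Telegram's actual implementation; the class and method names are made up):

```python
import time

class SlowMode:
    """Per-member message cooldown, as a group admin might configure it."""

    def __init__(self, cooldown_seconds):
        # Telegram's range is reportedly 30 seconds up to one hour (3600s).
        self.cooldown = cooldown_seconds
        self.last_sent = {}  # member id -> timestamp of last accepted message

    def try_send(self, member_id, now=None):
        """Return True if the member may post now, recording the attempt."""
        now = time.time() if now is None else now
        last = self.last_sent.get(member_id)
        if last is not None and now - last < self.cooldown:
            return False  # still cooling down
        self.last_sent[member_id] = now
        return True
```

The cooldown applies per member, so a quiet member can always post immediately while a chatty one has to wait out the timer.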
The messaging platform, which had more than 200 million monthly active users as of early 2018, said the new feature was aimed at making conversations in groups “more orderly” and raising the “value of each individual message.” It suggested that admins “keep [the feature] on permanently, or toggle as necessary to throttle rush hour traffic.”
Tech platforms including WhatsApp are grappling with containing the spread of misinformation on their messaging services. Though Telegram has largely been immune to such controversies, it has its fair share of issues.
WhatsApp has enforced limits on how often a user can forward a text message and is using machine learning techniques to weed out fraudulent users during the sign-up process itself.
Shivnath Thukral, Director of Public Policy for Facebook in India and South Asia, said at a conference this month that virality of content has dropped by 25% to 30% on WhatsApp since the messaging platform imposed limits on forwards.
Telegram isn’t marketing the “Slow Mode” as a way to tackle the spread of false information, though. Instead, it says the feature would give users more “peace of mind.” Indeed, unlike WhatsApp, which allows up to 256 users to be part of a group, up to a whopping 200,000 users can join a Telegram group.
this new Telegram groups feature is so interesting pic.twitter.com/763mHGmZ0u
— freia lobo (@freialobo) August 10, 2019
In a similar vein, Telegram has also added an option that will enable users to send a message without triggering a sound notification at the recipient’s end. “Simply hold the Send button to have any message or media delivered without sound,” the app maker said. “Your recipient will get a notification as usual, but their phone won’t make a sound – even if they forgot to enable the Do Not Disturb mode.”
Telegram has also introduced a range of other small features, such as the ability for group owners to add custom titles for admins. Videos on the app now display thumbnail previews when a user scrubs through them, making it easier for them to find the right moment. Like YouTube, users on Telegram can now share a video that jumps directly to a certain timestamp. Users can also animate their emojis now — if they are into that sort of thing.
In June, Telegram introduced a number of location-flavored features to allow users to quickly exchange contact details without needing to type in digits.
A security researcher has demonstrated how to force everyday commercial speakers to emit harmful sounds.
By-the-minute car rental service Car2go is raising its rates for short trips under the guise of variable pricing, the company announced to its users today. As we’ve seen with other variably priced services like delivery and ride hailing, in practice this means you never really know what it will cost but will have little choice but to pay.
In an email to users of its service, Car2go said that as a result of “constantly evaluating our product, packages, and pricing strategies” it had arrived at the new system, under which price will depend on time, location and day. The new cost structure takes effect next month.
For Car2go users, this will generally mean paying more. The company highlighted a new cheaper possible per-minute rate of 35 cents, significantly lower than the current $0.45 rate. But it’s easy to guess when that lower rate will be available: “times, locations and days” when no one is using the service. Meanwhile, it’s also possible to encounter a new higher per-minute rate of up to 49 cents when cars are in demand or in a high-use location.
Blocks of time from half an hour to four hours are all increasing in price: The current flat rates are now floor rates, with the possibility you’ll be paying as much as a third more than before. For example, a two-hour block currently costs $29; soon it will cost somewhere between $30 and $39. Again, you won’t know until you open the app to check, at which point you’re probably already committed.
Day-length packages are actually cheaper under the new system, but no longer include miles, so while a 24-hour pass used to be $79, now it’s $70 — but at 19 cents per mile, you’ll be in the red after less than 50 miles. And the price only goes up from there. Still, it’s conceivable you’ll pay less for a two- or three-day rental if you’re not actually going anywhere distant, but just need a car for the weekend.
A newly instituted zone-based charge and refund system punishes drivers for leaving the city center and rewards those at the periphery for driving back toward heavy usage areas. There’s a $5 charge if you leave the central zone, and a $5 refund — or the price of the trip, if less — if you bring a car in from the outer one. (Consult your local Car2go to see what the zones are in your city.)
Count the cards here and you can see the house always wins. If you’re going out, the full $5 fee always applies. If you’re coming in, it will be very difficult to nail that $5 ride — go under and Car2go reimburses less than $5 (and thus comes out ahead); go over and you end up paying money anyway. It’s just one of those clever little traps businesses set up.
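The zone math can be modeled in a few lines of Python (a toy illustration; the function name is made up, and the fee amount is taken from the description above):

```python
def zone_adjustment(direction, trip_price, fee=5.00):
    """Illustrative model of the zone charge/refund system.

    Leaving the central zone always costs the full fee; bringing a car
    back in refunds the fee or the trip price, whichever is less.
    Negative return values are refunds to the rider.
    """
    if direction == "outbound":
        return fee                     # full $5 charge, no cap
    elif direction == "inbound":
        return -min(fee, trip_price)   # refund capped at the trip price
    return 0.0
```

A $3 inbound trip only earns a $3 refund, while the outbound fee is always the full $5 — which is why the house comes out ahead on average.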
You can see the full changes in the chart below:
Oh, and your first 200 trips this calendar year have an additional $1 fee. You’re welcome!
In case you can’t tell, this is bad news for consumers, though it would be too much to expect that these prices would stay stable for years. But variable pricing is fundamentally anti-consumer because of a lack of transparency under which the companies controlling it can pull all kinds of shenanigans. Sadly, that makes it a great choice for the bottom line.
These unwelcome changes come six months after Car2go joined the BMW-Daimler joint venture Share Now, which has a variety of car-share services around the world it intends to unify under a single brand soon (it already killed ReachNow, rather abruptly). Apparently larger scale and reduced competition don’t actually lead to lower prices — unfortunate for their customers. But overall, floating car-share services remain an important option. Just not as cheap as they used to be.
As companies collect increasingly large amounts of data about customers, the end game is about improving the customer experience. It’s a term we’re hearing a lot of these days, and we are going to be discussing that very topic with Amit Ahuja, Adobe’s vice president of ecosystem development, next month at TechCrunch Sessions: Enterprise in San Francisco. Grab your early-bird tickets right now — $100 savings ends today!
Customer experience covers a broad array of enterprise software and includes data collection, analytics and software. Adobe deals with all of this, including the Adobe Experience Platform for data collection, Adobe Analytics for visualization and understanding and Adobe Experience Cloud for building applications.
The idea is to begin to build an understanding of your customers through the various interactions you have with them, and then build applications to give them a positive experience. There is a lot of talk about “delighting” customers, but it’s really about using the digital realm to help them achieve what they want as efficiently as possible, whatever that means to your business.
Ahuja will be joining TechCrunch’s editors, along with Qualtrics chief experience officer Julie Larson-Green and Segment CEO Peter Reinhardt to discuss the finer points of what it means to build a customer experience, and how software can help drive that.
Ahuja has been with Adobe since 2005, when he joined as part of the $3.4 billion Macromedia acquisition. His primary role today involves building and managing strategic partnerships and initiatives. Prior to this, he was the head of Emerging Businesses and the GM of Adobe’s Data Management Platform business, which focuses on advertisers. He also spent seven years in Adobe’s Corporate Development Group, where he helped complete the acquisitions of Omniture, Scene7, Efficient Frontier, Demdex and Auditude.
Amit will be joining us on September 5 in San Francisco, along with some of the biggest influencers in enterprise, including Bill McDermott from SAP, Scott Farquhar from Atlassian, Aparna Sinha from Google, Wendy Nather from Duo Security, Aaron Levie from Box and Andrew Ng from Landing AI.
Early-bird savings end today, August 9. Book your tickets today and you’ll save $100 before prices go up.
Bringing a group? Book our 4+ group tickets and you’ll save 20% on the early-bird rate. Bring the whole squad here.
Part two of our article on “Robots.txt best practice guide + examples” talks about how to set up your newly created robots.txt file.
If you are not sure how to create your own robots.txt file, or are not sure what one is, head on over to the first part of this article series, “Robots.txt best practice guide + examples,” where you will be able to learn the ins and outs of what a robots.txt file is and how to properly set one up. Even if you have been in the SEO game for some time, the article offers a great refresher course.
How to add a robots.txt file to your site
A robots.txt file must be stored in the root of your website for it to be found. For example, if your site were https://www.mysite.com, your robots.txt file would be found here: https://www.mysite.com/robots.txt. By placing the file in the main folder or root directory of your site, you will then be able to control the crawling of all URLs under the https://www.mysite.com domain.
It’s also important to know that a robots.txt is case sensitive, so be sure to name the file “robots.txt” and not something like Robots.txt, ROBOTS.TXT, robots.TXT, or any other variation with capital letters.
Why a robots.txt file is important
A robots.txt file is just a plain text file, but that “plain” text file is extremely important: it is used to let the search engines know exactly where they can and cannot go on your site. This is why it is an extremely important part of your website.
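As an illustration, a minimal robots.txt might look like the following (the disallowed paths and sitemap URL are placeholders, not recommendations for your site):

```
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://www.mysite.com/sitemap.xml
```

This tells every crawler it may fetch anything on the site except URLs under /admin/ and /cart/, and points it to the sitemap.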
Once you have added your brand-new robots.txt file to your site, or are simply making updates to your current robots.txt file, it’s important to test it out to make sure that it is working the way that you want.
While there are lots of sites and different tools that you can use to test out your robots.txt file, you can still use Google’s robots.txt file tester in the old version of Search Console. Simply log in to your site’s Search Console, scroll down to the bottom of the page and click on → Go to old version
Then click on Crawl → robots.txt tester
From here, you’ll be able to test your site’s robots.txt file by adding the code from your file to the box and then clicking on the “test” button.
If all goes well, the red test button should turn green and switch to “Allowed.” Once that happens, your newly created or modified robots.txt file is valid, and you can upload it to your site’s root directory.
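If you’d rather sanity-check your rules before touching Search Console at all, Python’s standard library ships a parser for the robots exclusion protocol. A quick local check might look like this (the rules and URLs below are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules: block /admin/, allow everything else.
rules = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) applies the parsed rules for that agent.
print(parser.can_fetch("*", "https://www.mysite.com/blog/post"))   # True
print(parser.can_fetch("*", "https://www.mysite.com/admin/login")) # False
```

Note that this only checks crawl rules as Python interprets them; Google’s own tester remains the authority on how Googlebot will read your file.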
Google updates to robots.txt file standards, effective September 1
Google recently announced that changes are coming to how Google understands some of the unsupported directives in your robots.txt file.
Effective September 1, Google will drop support for unsupported and unpublished rules in the robots exclusion protocol. That means that Google will no longer honor robots.txt files with the noindex directive listed within the file.
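For example, a rule like the following, which some sites have used in robots.txt to try to keep pages out of the index, will simply be ignored after September 1 (the path is illustrative):

```
User-agent: *
Noindex: /private-page/
```

If your file contains lines like this, you will need to switch to one of the supported alternatives below.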
If you have used the noindex directive in your robots.txt file in the past to control crawling, there are a number of alternative options that you can use:
Noindex in robots meta tags: Supported both in HTTP response headers and in HTML, the noindex directive is the most effective way to remove URLs from the index when crawling is allowed.
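In practice, the HTML version is a single meta tag in the head of the page you want dropped from the index:

```html
<!-- In the <head> of the page to be removed from the index -->
<meta name="robots" content="noindex">
```

For non-HTML resources such as PDFs, the equivalent is the `X-Robots-Tag: noindex` HTTP response header.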
404 and 410 HTTP status codes
Both of these status codes mean that the page does not exist, which will drop any URLs that return this code from Google’s index once they’re crawled and processed.
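As a sketch, if your site happens to run on nginx, a removed page can be made to return 410 Gone with a rule like the following (the path is illustrative; other servers have equivalent mechanisms):

```nginx
# Tell crawlers this page is permanently gone
location = /old-page/ {
    return 410;
}
```

A 410 signals permanent removal, while a 404 is ambiguous about whether the page might return; both will eventually drop the URL from the index.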
Adding password protection is a great way to block Google from seeing and crawling pages on your site, or your site entirely (think of a dev version of the site). Hiding a page behind a login will generally remove it from Google’s index, as Google is not able to fill in the required information to see what’s behind the login. You can use the Subscription and paywalled content markup for that type of content, but that’s a whole other topic for another time.
Disallow in robots.txt
Search engines can only index pages that they know about (can find and crawl), so blocking a page or pages from being crawled usually means their content won’t be indexed. It’s important to remember that Google may still find and index those pages through links from other pages.
Search Console Remove URL tool
The Search Console removal tool offers a quick and easy way for you to remove a URL temporarily from Google’s search results. We say temporarily because this option is only valid for about 90 days. After that, your URL can again appear in Google’s search results.
To make your removal permanent, you will need to follow one of the steps mentioned above:
- Block access to the content (requiring a password)
- Add a noindex meta tag
- Return a 404 or 410 HTTP status code
Making small tweaks can sometimes have a big impact on your site’s SEO, and using a robots.txt file is one of those tweaks that can make a significant difference.
Remember that your robots.txt file must be uploaded to the root of your site and must be called “robots.txt” for it to be found. This little text file is a must-have for every website, and adding a robots.txt file to the root folder of your site is a very simple process.
I hope this article helped you learn how to add a robots.txt file to your site, as well as the importance of having one. If you want to learn more about robots.txt files and you haven’t done so already, you can read part one of this article series “Robots.txt best practice guide + examples.”
What’s your experience creating robots.txt files?
The post Robots.txt best practice guide, part 2: Setting up your robots.txt file appeared first on Search Engine Watch.
If you’re planning your 2020 marketing budget, there are a lot of paid media trends you should be considering.
Read more at PPCHero.com
Hyp3r, an apparently trusted marketing partner of Facebook and Instagram, has been secretly collecting and storing location and other data on millions of users, against the policies of the social networks, Business Insider reported today. It’s hard to see how it could do this for years without intervention by the platforms unless the latter were either ignorant or complicit.
After BI informed Instagram, the company confirmed that Hyp3r (styled HYP3R) had violated its policies and has now been removed from the platform. In a statement to TechCrunch, a Facebook spokesperson confirmed the report, saying:
HYP3R’s actions were not sanctioned and violate our policies. As a result, we’ve removed them from our platform. We’ve also made a product change that should help prevent other companies from scraping public location pages in this way.
The company started several years ago as a platform via which advertisers could target users attending a given event, like a baseball game or concert. Originally it used Instagram’s official API to hoover up data, the kind of data-gathering that has been happening for years by unsavory firms in tech, most infamously Cambridge Analytica.
The idea of getting an ad because you’re at a ball game isn’t so scary, but if the company maintains a persistent record not just of your exact locations, but objects in your photos and types of places you visit, in order to combine that with other demographics and build a detailed shadow profile… well, that’s a little scary. And so Hyp3r’s business model evolved.
Unfortunately, the API was severely restricted in early 2018, limiting Hyp3r’s access to location and user data. Although there were unconfirmed reports that this led to layoffs at the company around the time, the company seems to have survived (and raised millions shortly afterwards) not by adapting its business model, but by sneaking around the apparently quite minimal barriers Instagram put in place to prevent location data from being scraped.
Some of this was done by taking advantage of Instagram’s Location pages, which would serve up public accounts visiting them to anyone who asked, logged in or not. (This was one of the features turned off today by Instagram.)
According to BI’s report, Hyp3r built tools to circumvent limitations on both location collection and saving of personal accounts’ stories — content meant to disappear after 24 hours. If a user posted anything at one of thousands of locations and regions monitored by Hyp3r, their data would be sucked up and added to their shadow profile.
To be clear, it only collected information from public stories and accounts. Naturally these people opted out of a certain amount of privacy by choosing a public account, but as the Cambridge Analytica case and others have shown, no one expects or should have to expect that their data is being secretly and systematically assembled into a personal profile by a company they’ve never heard of.
Facebook and Instagram, however, had definitely heard of Hyp3r. In fact, Hyp3r could until today be found in the official Facebook Marketing Partners directory, a curated list of companies it recommends for various tasks and services that advertisers might need.
And Hyp3r has been quite clear about what it is doing, though not about the methods by which it is doing it. It wasn’t a secret that the company was building profiles based around tracking locations and brands — that was presumably what Facebook listed it for. It was only when this report surfaced that Hyp3r had its Facebook Marketing Partner privileges rescinded.
For its part Hyp3r claims to be “compliant with consumer privacy regulations and social network Terms of Services,” and emphasized in a statement that it only accessed public data.
It’s unclear how Hyp3r could exist as a privileged member of Facebook’s stable of recommended companies and simultaneously be in such blatant violation of its policies. If these partners receive even cursory reviews of their products and methods, wouldn’t it have been obvious to any informed auditor that there was no legitimate source for the location and other data that Hyp3r was collecting? Wouldn’t it have been obvious that it was engaging in Automated Data Collection, which is specifically prohibited without Facebook’s permission?
I’ve asked Facebook for more detail on how and when its Marketing Partners are reviewed, and how this seemingly fundamental violation of the prohibition against automated data collection could have gone undetected for so long. This story is developing and may be updated further.
The Daily Crunch is TechCrunch’s roundup of our biggest and most important stories. If you’d like to get this delivered to your inbox every day at around 9am Pacific, you can subscribe here.
The landscape has changed dramatically since the Galaxy Note was first unveiled in 2011, with Samsung pulling the rest of the industry into a world of bigger screens.
Now, the question is how to make the latest updates compelling. With the Note 10 and 10+ (available August 23 at a starting price of $950), Samsung is splitting the line into two distinct devices, and it’s getting rid of the headphone jack.
The Live View feature isn’t designed with the idea that you’ll hold up your phone continually as you walk. Instead, it provides quick, easy and super-useful orientation by showing you arrows and big, readable street markers overlaid on the real scene in front of you.
Despite big losses, what made Wall Street happy was Lyft’s optimism for Q3, as well as the full-year 2019.
According to The Hollywood Reporter, the deal is worth $200 million. This follows expensive Netflix pacts with other high-profile showrunners, including Ryan Murphy ($300 million) and Shonda Rhimes ($100 million).
Hyp3r, an apparently trusted marketing partner of Facebook and Instagram, has been secretly collecting and storing location and other data on millions of users, violating the policies of the social networks, according to Business Insider.
Both Dadi and Legacy recently raised funding, hoping to leverage venture capital dollars to become the dominant men’s fertility brand.
Danny Crichton argues that although August is generally considered a black hole for VC, using it effectively for fundraising is perhaps the single most important factor for success in the coming season. (Extra Crunch membership required.)