Why Google might not be able to stop “fake news”

Originally at http://feeds.searchengineland.com/~r/searchengineland/~3/eQOv1ACDcPo/google-might-not-able-stop-fake-news-263850

Google says it wants to prevent fake news from spreading, but the search giant has serious challenges in actually curbing it.

New Google AdWords Reports Help Smart Marketers Increase Store Visits by @MattGSouthern

Originally at http://tracking.feedpress.it/link/13962/4838076

Online marketing for a physical store is a challenge that Google hopes to make easier with the release of two new AdWords reports.

What B2B Marketers Can Learn From Their B2C Friends

Originally at http://feedproxy.google.com/~r/socialmediatoday_allposts/~3/B__WcLgbo4E/what-b2b-marketers-can-learn-their-b2c-friends

B2B marketers can learn more than you might think from B2C campaigns. Here are some key points to consider. 

Automating Technical Reporting for SEO

Originally at http://tracking.feedpress.it/link/9375/4836642

Posted by petewailes

As the web gets more complex, with JavaScript framework and library front ends, progressive web apps, single-page apps, JSON-LD, and so on, there's an ever-greater surface area for things to go wrong. When all you've got is HTML, CSS, and links, there's only so much you can mess up. In today's world of dynamically generated websites with universal JS interfaces, however, there's a lot of room for errors to creep in.

The second problem is that it's hard to know when something has gone wrong, or when Google has changed how it handles something. This is only compounded by situations like site migrations or redesigns, where you might suddenly archive a lot of old content or re-map a URL structure. How do we address these challenges, then?

The old way

Historically, you'd analyze things like this by looking through your log files in Excel or, if you're hardcore, Log Parser. Those tools are great, but they require you to know you've got an issue, or to happen to grab a section of logs containing the issues you need to address. Not impossible, and we've written about doing this fairly extensively, both on our blog and in our log file analysis guide.
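
For example, a quick manual pass over an access log to surface crawler errors might look like the sketch below (Python, assuming the combined log format; the access.log path and the Googlebot check are illustrative assumptions, not the author's tooling):

import re
from collections import Counter

# Matches the combined log format; lines that don't parse are skipped.
LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

errors = Counter()
with open("access.log") as log:
    for line in log:
        m = LINE.match(line)
        if not m:
            continue
        # Count crawler requests that ended in a 4xx/5xx response
        if "Googlebot" in m["agent"] and m["status"].startswith(("4", "5")):
            errors[(m["status"], m["path"])] += 1

# Rank the worst offenders
for (status, path), count in errors.most_common(10):
    print(status, path, count)

Even a script like this only helps if you remember to run it against the right slice of logs, which is precisely the limitation discussed next.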

The problem with this, though, is fairly obvious: it requires that you look, rather than making you aware that there's something to look for. With that in mind, I thought I'd spend some time investigating whether the whole process could be made faster and act as an early warning system.

A helping hand

The first thing we need to do is set our server to send log files somewhere. My standard solution has become log rotation. The method depends on your server, but on Nginx it looks like this:

# time_iso8601 looks like this: 2016-08-10T14:53:00+01:00
if ($time_iso8601…
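
One common way to complete this pattern (a sketch of the usual Nginx date-split idiom, not necessarily the author's exact config) captures the date parts from $time_iso8601 and uses them in the access_log path:

# Sketch only: capture year, month, and day from the ISO 8601 timestamp
if ($time_iso8601 ~ "^(\d{4})-(\d{2})-(\d{2})") {
    set $year $1;
    set $month $2;
    set $day $3;
}
# With variables in the path, Nginx re-evaluates the log file name per
# request, so each day's traffic lands in its own file.
access_log /var/log/nginx/access-$year-$month-$day.log;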

Need help with your digital marketing? Fill out this form and see what we can do for you and your business: http://nationwideseo.com.au/discovery-page/