What Is A Server Log File?

If you already know what a log file is and want to learn how to analyze logs for SEO, check out Log File Analysis For SEO.

Server log files are a raw, unfiltered look at traffic to your site. They’re text files stored on your web server. Every time any browser or user-agent, Google included, requests any resource—pages, images, JavaScript files, whatever—from your server, the server adds a line to the log.

That makes log files giant piles of juicy data.

If you already know how they work and want to analyze them, read my post about log file analysis for SEO. If you know that, too, get a cup of coffee and take care of all those emails in your inbox. You don’t need to read this.

Log File Contents

Here’s a line from Portent’s server log. I edited it to simplify a bit:

11.222.333.44 - - [11/Dec/2018:11:01:28 -0600] "GET /blog/page-address.htm HTTP/1.1" 200 182 "-" "Mozilla/5.0 Chrome/60.0.3112.113"

If you want to know the basics, you can read up on the Common Log Format.

The last bit, starting with “Mozilla,” is the user agent: the type of browser or other software that’s accessing your site. It’s important if you’re analyzing the log file for SEO, checking what software is accessing your site, or troubleshooting a specific server problem. If Googlebot requests a resource, you’ll see a user agent string that includes “Googlebot.” If Bingbot hits your site, you’ll see a user agent string that includes “bingbot.”
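
If you want a quick census of every user agent hitting your site, splitting each line on double quotes puts the user agent in field six, so one command does it. A sketch, assuming a combined-format log file named access.log:

# count requests per user agent string, most frequent first
awk -F'"' '{print $6}' access.log | sort | uniq -c | sort -rn | head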

Breaking Down A Log Entry

Here’s the example again:

11.222.333.44 - - [11/Dec/2018:11:01:28 -0600] "GET /blog/page-address.htm HTTP/1.1" 200 182 "-" "Mozilla/5.0 Chrome/60.0.3112.113"

On December 11, 2018, someone using Google Chrome tried to load https://www.portent.com/blog/page-address.htm. The ‘200’ means the server found the file (yay!). Page-address.htm is teeny, weighing in at 182 bytes.

The IP address of the client—the software that requested the file—was 11.222.333.44. I put that last because, for many reasons, it’s not terribly helpful to us marketers.
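
If you ever need to pull those fields out in bulk, they’re just whitespace-separated columns. Another sketch against the same assumed access.log:

# IP address (field 1), status code (field 9), and requested URL (field 7)
awk '{print $1, $9, $7}' access.log | head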

Again: Every request from every user agent is a line in the log file. Every request. Every. Single. One.

Getting The Logs

That’s the rub. Some technical teams cling to log files, citing security concerns. Some site platforms hide log files so deep in their twisted innards that finding them requires an electronic colonoscopy.

But the log files are there. They’re not a security risk. The site developer can zip them up and send them to you. Buy beers, bring chocolate, do whatever you need to do to make friends. Then ask.

If the files are gigantic, ask for a snapshot. A few days or even a few hours is a good start.
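
If you end up slicing the snapshot yourself, a command-line search tool like grep makes short work of it. A sketch that pulls a single day out of the same assumed access.log, using the date format from the example above:

grep '11/Dec/2018' access.log > one-day.log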

What You Can Learn And What You Don’t Need

Log files are data lasagna. They’re yummy. They’re substantial. And they’ll put you to sleep if you overindulge.

I use them to find:

  • Spider traps. Log files give you a great look at how search bots are crawling your site.
  • Spam content. If some hacker dumped a bunch of pages listing porn links on your site, any clicks to those pages appear in the log files.
  • Broken external links. Google eventually gives up crawling broken links from other sites. But people still click them. Track down those busted external links and reclaim some authority (the sketch after this list shows one way to find them).
  • Incorrect server responses. Google Search Console surfaces some response errors, but the log file shows every response code your server actually returned.
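
Here’s one way to surface those, again assuming an access.log in this format (the status code is the ninth whitespace-separated field, the requested URL the seventh):

# most-requested URLs that returned 404, busiest first
awk '$9 == 404 {print $7}' access.log | sort | uniq -c | sort -rn | head -20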

You can’t use them to:

  • Get keyword data. Keyword data loss isn’t just an analytics software problem. “Not provided” means you can’t find search terms here, either.
  • Track user sessions (usually). Most user session tracking requires JavaScript. Use a tool like Google Analytics, instead.
  • Track individual users. In theory, you could track visits from Ian Lurie. But it would require much mind-numbing labor.
  • Track rendering times. Log files show requests for resources. They don’t track what happens after the request. If a page renders incorrectly or slowly, it won’t show up here.

You don’t need to use them to:

  • Track conversions. Conversion tracking in log files is like sitting on your tongue. Feasible, but not recommended.
  • Analyze geographic data. You can, but most analytics software shows location data and requires a lot less work.
  • Track click paths through your site. Again, possible, but you can get the data more easily from your analytics software.

My Favorite Log Analysis Tools

Screaming Frog Log File Analyser is my number one choice. It’s a great combination of power and usability, and you can merge log file data with crawls and other data sources.

Splunk is so powerful it terrifies me. But it’s great for managing giant log files in near-real-time.

Apache Log Viewer is free. It has a steeper learning curve than Screaming Frog, but, you know, free.

Log files don’t provide conversion data or session data. That kind of tracking requires cookies and a client-side analytics suite, like Google Analytics. They do provide a record of every website resource requested from every browser and user-agent. That makes them very powerful.

Ian, WTF?

I usually write metaphor-stuffed rants. This is more Wikipedia style because I write about log files all the time. This seems more efficient than adding a “what’s a log file?” section to every post.

The post What Is A Server Log File? appeared first on Portent.

Email Marketing an Efficient Way to Promote Business

Most internet marketers today are of the opinion that email marketing is a redundant marketing strategy. These marketers have abandoned the technique and replaced it with social media promotion. Admittedly, social media looks sexier than email. But is a social media post truly as powerful as an email? If we consider that it’s easier to send a friend request on Facebook than to collect one email address, then email marketing surely looks less effective. But trust me: email marketing is many times more useful than promoting your brand on social media platforms like Facebook and Twitter.

In this blog post, we’ll elaborate a little more on email marketing.
How Does Email Marketing Work?

Email marketing is a promotion strategy that uses emails targeted at customers. Emails are sent to potential customers to make them aware of the brand the company is promoting. The subscriber receives an email that is either promotional or in the form of an ad.
Many marketing companies now include this strategy in their campaigns, and an increasing number of bloggers are making use of email marketing as well. One scenario is worth a mention: an internet user visiting a website notices a field asking them to submit their name and email address to receive a free e-book. This is a basic example of how email marketing works.
Why Is Email Marketing A Good Marketing Strategy?

Email marketing starts with a technique that helps improve the customer’s or user’s experience. It’s totally different from glamorous, flashy social media marketing. It’s a more efficient way to engage customers, and an even better way to reach the maximum number of users. To say the least, it provides a better click-through rate.
The following statistics will make the point clearer:
  • In 2013, nearly 3.2 billion new email accounts were created.
  • 95% of internet consumers are active email users, relying on it as a medium of communication.
  • 91% of these consumers check the emails they receive daily.
Moreover, this is the era of mobile browsing, so people are notified as soon as they receive an email on their mobile device. On a phone, notifications give users easy access, and users find it more convenient to check an email than a Facebook or Twitter post, since millions of people post content at once and finding a specific post you liked, say, a couple of weeks ago is cumbersome.
Conclusion
To conclude, email marketing is a good way to boost your online presence.

3 Convincing Reasons to take Help of Digital Agency

It is true that, these days, behind every successful company there is the work of a digital agency. It is important to make your online presence powerful, effective, and influential. You have to be careful about how you sound and look in the industry. It might interest you that most people form an idea about a company or business through its online presence.

Take Professional Help

One thing to underline here is that you should take professional help. You cannot maintain and manage your online presence and platforms unless you have full-time professionals to take care of them. If you assume your own staff members can take care of it on the side, that can turn into a big flop. It is better to take the help of professionals, like a digital agency, whose experts give their time and effort to your organization. There are several reasons to consider professionals:

  • Access the Needed Skills

Building an in-house team to manage the entirety of your digital marketing tasks and efforts is a challenge for many businesses. The skills your business needs are either tough to come by or too pricey. On top of this, it would not be financially feasible to recruit someone for a full-time or even part-time placement if you don’t need their skills continually and consistently. Of course, the campaigns you run will change at different times of the year. What you can do here is outsource your digital marketing tasks. Digital agencies have a core team of professionals, technicians, and designers who get the best out of your digital presence.

  • Deadlines are Met with Ease
Once you have a solid marketing strategy in place, your company cannot afford the failures that can emerge with an in-house team. A typical agency has multiple redundancies in place. Whether that means multi-person teams working on your campaigns, automation solutions and software, or versatile staff members who can support each other, these folks are in a position to give more certainty that campaigns and objectives get delivered on time. Of course, if your business fails to meet a deadline, that tarnishes your reputation.
  • Get Fresh Perspectives
It is apparent that an in-house team is inherently limited by the experiences its members possess. They sit in something of an echo chamber, exposed only to the products and industry you are trying to promote and the methods used to promote them. Professional firms, however, work with a huge variety of industries, business types, and marketing specialists. These professionals get to learn about and design successful, innovative marketing methods and apply them to diverse sectors of the business community.

Moreover, the employees of these agencies are required to meet ongoing professional development needs. The agencies make sure their employees, developers, and designers stay current with the latest trends. Once you partner with a professional digital agency, you are in a position to tap into this insight and expertise in a tremendously effective and targeted way.
So, if you want to soar in your industry, you have to make the most of the digital space. You have to take the steps that are crucial for your business, and the foremost step should be to have a word with a professional, innovative digital agency.
About the Author

Ryan Rambajohn is a hands-on marketing professional experienced at launching new business initiatives and developing online marketing programs. He is the founder of Reach Above Media, which offers web design in New York, including SEO and mobile app development.

Log File Analysis For SEO

Log file analysis is a lost art. But it can save your SEO butt. This post is half-story, half-tutorial about how web server log files helped solve a problem, how we analyzed them, and what we found. Read on, and you’ll learn to use grep plus Screaming Frog to saw through logs and do some serious, old school organic search magic (Saw. Logs. Get it?!!! I’m allowed two dad jokes per day. That’s one).

If you don’t know what a log file is or why they matter, read What Is A Log File? first.

The Problem

We worked with a client who rebuilt their site. Organic traffic plunged and didn’t recover.

The site was gigantic: Millions of pages. We figured Google Search Console would explode with dire warnings. Nope. GSC reported that everything was fine. Bing Webmaster Tools said the same. A site crawl might help but would take a long time.

Phooey.

We needed to see how Googlebot, BingBot, and others were hitting our client’s site. Time for some log file analysis.

This tutorial requires command-line-fu. On Linux or OS X, open Terminal. On a PC, use a utility like cmder.net.

Tools We Used

My favorite log analysis tool is Screaming Frog Log File Analyser. It’s inexpensive, easy to learn, and has more than enough oomph for complex tasks.

But our client’s log file snapshot was over 20 gigabytes of data. Screaming Frog didn’t like that at all:

Screaming Frog says "WTF?"

Understandable. We needed to reduce the file size, first. Time for some grep.

Get A Grep

OK, that’s dad joke number two. I’ll stop.

The full, 20-gig log file included browser and bot traffic. All we needed was the bots. How do you filter a mongobulous log file?

  • Open it in Excel (hahahahahahaahahahah)
  • Import it into a database (maybe, but egads)
  • Open it in a text editor (and watch your laptop melt to slag)
  • Use a zippy, command line filtering program. That’s grep

“grep” stands for “global regular expression print.” Grep’s parents really didn’t like it. So we all use the acronym, instead. Grep lets you sift through large text files, searching for lines that contain specific strings. Here’s the important part: It does all that without opening the file. Your computer can process a lot more data if it doesn’t have to show it to you. So grep is super-speedy.

Here’s the syntax for a typical grep command:

grep [options] [thing to find] [files to search for the thing]

Here’s an example: It searches every file that ends in “*.log” in the current folder, looking for lines that include “Googlebot,” then writes those lines to a file called botsonly.txt:

grep -h -i 'Googlebot' *.log >> botsonly.txt

The -h means “Don’t record the name of the file where you found this text.” We want a standard log file. Adding the filename at the start of every line would mess that up.

The -i means “ignore case.”

Googlebot is the string to find.

*.log says “search every file in this folder that ends with .log”

The >> botsonly.txt isn’t a grep command. It’s a little Linux trick. >> appends the output of a command to a file instead of printing it to the screen (a single > would overwrite the file instead). In this case, it writes to botsonly.txt.

For this client, we wanted to grab multiple bots: Google, Bing, Baidu, DuckDuckBot, and Yandex. So we added -e. That lets us search for multiple strings:

grep -h -i -e 'Googlebot' -e 'Bingbot' -e 'Baiduspider' -e 'DuckDuckBot' -e 'YandexBot' *.log >> botsonly.txt

Every good Linux nerd out there just choked and spat Mountain Dew on their keyboard. You can replace this hackery with a terribly elegant regular expression that accomplishes the same thing in fewer characters. I am not elegant.
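
For the elegant among you, extended regular expressions (grep -E) collapse the whole thing into a single alternation and produce the same botsonly.txt:

grep -h -i -E 'Googlebot|Bingbot|Baiduspider|DuckDuckBot|YandexBot' *.log >> botsonly.txt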

Breaking down the full command:

-h: Leaves out filenames
-i: Case-insensitive (I’m too lazy to figure out the case for each bot)
-e: Filter for multiple strings, one after each instance of -e
>>: Append the results to a file

Bots crawl non-page resources, too, like javascript. I didn’t need those, so I filtered them out:

grep -h -v '\.js' botsonly.txt >> botsnojs.txt

-v inverts the match, finding all lines that do not include the search string. (The pattern is quoted, with the dot escaped, so the shell and grep treat “.js” literally.) So, the grep command above searched botsonly.txt and wrote every line that did not include “.js” to a new, even smaller file called botsnojs.txt.

Result: A Smaller Log File

I started with a 20-gigabyte log file that contained a bazillion lines.

After a few minutes, I had a one-gigabyte log file with under a million lines. Log file analysis step one: Complete.
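
If you want to sanity-check the result, standard command-line tools will show how much smaller things got (assuming the filenames from above):

# line counts before and after the .js filter
wc -l botsonly.txt botsnojs.txt
# size on disk
ls -lh botsnojs.txt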

Analyze in Screaming Frog

Time for Screaming Frog Log File Analyser. Note that this is not their SEO Spider. It’s a different tool.

I opened Screaming Frog, then drag-and-dropped the log file. Poof.

If your log file uses relative addresses—if it doesn’t have your domain name for each request—then Screaming Frog prompts you to enter your site URL.

What We Found

Google was going bonkers. Why? Every page on the client’s site had a link to an inquiry form.

A teeny link. A link with ‘x’ as the anchor text, actually, because it was a leftover from the old code. The link pointed at:

inquiry_form?subject_id=[id]

If you’re an SEO, you just cringed. It’s duplicate content hell: Hundreds of thousands of pages, all pointing at the same inquiry form, all at unique URLs. And Google saw them all:

Inquiry Forms Run Amok

60% of Googlebot events hit inquiry_form?subject_id= pages. The client’s site was burning crawl budget.
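
You can pull that number straight out of the filtered log. A sketch, assuming the botsnojs.txt file from earlier:

# total Googlebot requests
grep -c -i 'Googlebot' botsnojs.txt
# Googlebot requests hitting the inquiry form
grep -i 'Googlebot' botsnojs.txt | grep -c 'inquiry_form?subject_id='

Divide the second number by the first, and you have the share of Googlebot events going to the form.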

The Fix(es): Why Log Files Matter

First, we wanted to delete the links. That couldn’t happen. Then, we wanted to change all inquiry links to use fragments:

inquiry_form#subject_id=[id]

Google ignores everything after the ‘#.’ Problem solved!

Nope. The development team was slammed. So we tried a few less-than-ideal quick fixes:

  • robots.txt (a minimal sketch of this one follows the list)
  • meta robots
  • We tried rel canonical. No, we didn’t, because rel canonical was going to work about as well as trying to pee through a Cheerio in a hurricane (any parents out there know whereof I speak).
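
For the record, the robots.txt attempt was essentially a single Disallow rule. A minimal sketch, not the client’s actual file:

User-agent: *
Disallow: /inquiry_form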

Each time, we waited a few days, got a new log file snippet, filtered it, and analyzed it.

We expected Googlebot to follow the various robots directives. It didn’t. Google kept cheerfully crawling every inquiry_form URL, expending crawl budget, and ignoring 50% of our client’s site.

Thanks to the logs, though, we were able to quickly analyze bot behavior and know whether a fix was working. We didn’t have to wait weeks for improvements (or not) in organic traffic or indexation data in Google Search Console.

A Happy Ending

The logs showed that quick fixes weren’t fixing anything. If we were going to resolve this problem, we had to switch to URL fragments. Our analysis made a stronger case. The client raised the priority of this recommendation. The development team got the resources they needed and changed to fragments.

That was in January:

Immediate Lift From URL Fragments

These results are bragworthy. But the real story is the log files. They let us do faster, more accurate analysis: diagnose the problem, then test solutions far sooner than we otherwise could.

If you think your site has a crawl issue, look at the logs. Waste no time. You can thank me later.

I nerd out about this all the time. If you have a question, leave it below in the comments, or hit me up at @portentint.

Or, if you want to make me feel important, reach out on LinkedIn.

The post Log File Analysis For SEO appeared first on Portent.

Google’s Latest Broad Core Ranking Update: Florida 2

Florida 2 Algorithm Update

What’s Going On?

On March 12th, Google released what is being referred to as the Florida 2 algorithm update. Website owners are already noticing significant shifts in their keyword rankings. While Google’s algorithm updates vary in how much broad notice they receive, Florida 2 is one that every marketer needs to pay close attention to.

Who Was Impacted by the Florida 2 Algorithm Update?

Google makes several broad ranking algorithm updates each year, but only confirms updates that result in widespread impact. Google did confirm Florida 2, and there are SEOs out there already calling it the “biggest update [Google has made] in years.” Unlike last August’s Medic Update, Florida 2 isn’t targeting a specific niche or vertical, which means the entire search community needs to be paying attention as we try to better understand the changes Google is making to its search algorithm.

While it’s still too early for our team to pinpoint what exactly is being impacted by Florida 2, we’re going to keep a very close eye on where things fall out over the next several days (and weeks).

Here’s what we’ve seen so far:

  • Indiscriminate swings in site traffic & ranking, with some websites reporting zero traffic after the update.
  • Evidence of traffic increases for site owners who are prioritizing quality content and page speed.
  • A worldwide impact – this is not a niche specific or region specific update.
  • Potential adjustments in how Google is interpreting particular search queries.
  • Backlink quality possibly being a very important factor.
  • Short-term keyword ranking changes (declines in ranking that then bounce back to “normal” after a few hours).

My Rankings Took a Hit. What Can I Do?

In short? Nothing. But don’t panic.

As with any Google algorithm update, websites will see increases or declines in their rankings. There is no quick fix for sites or web pages that experience negative results from the update; don’t make the mistake of aggressively changing aspects of your site without fully understanding the broader impact those changes will have. If you are being negatively impacted by Florida 2 (or any other algorithm update), your best bet is continuing to focus on offering the best content you can, as that is what Google always seeks to reward.

For advice on how to produce great content, a good starting point is to review Google’s Search Quality Guidelines. For recommendations on copywriting tips and strategies check out the copywriting section of the Portent blog.

We’ll continue to update this post as we gather more information.

The post Google’s Latest Broad Core Ranking Update: Florida 2 appeared first on Portent.
