3 Convincing Reasons to Take the Help of a Digital Agency

It is true that these days, behind every successful company there is a digital agency playing a role. It is important that you make your online presence powerful, effective, and influential. You have to be careful about how you sound and look in the industry. It might interest you that most people form an impression of a company or business through its online presence.

Take Professional Help

One thing that you have to underline here is that you should take professional help. You cannot maintain and manage your online presence and platforms unless you have full-time professionals to take care of them. If you assume your existing staff members can handle it on the side, that approach is likely to fall flat. It is better to take the help of professionals, like a digital agency whose experts give all their time and effort to your organization. Here are some reasons to consider professionals:

  • Access the Needed Skills

Building an in-house team to manage the entirety of your digital marketing tasks and efforts is a challenge for many businesses. The skills your business needs are either tough to come by or too pricey. Moreover, it would not be financially feasible to recruit someone for a full-time or even part-time position if you don't need their skills continually and consistently. Of course, the campaigns that you run will change at different times of the year. What you can do here is outsource your digital marketing tasks. Digital agencies have a core team of professionals, technicians, and designers who get you the best for your digital presence.

  • Deadlines are Met with Ease
Once you have a solid marketing strategy in place, your company cannot afford the kinds of failure that can emerge with an in-house team. A typical agency has multiple redundancies in place. Whether that means multi-person teams working on your campaigns, automation solutions and software, or versatile staff members who can support each other, agencies are in a position to give more certainty that campaigns and objectives get delivered on time. Of course, if your business fails to meet a deadline, that tarnishes your reputation.
  • Get Fresh Perspectives
It is apparent that an in-house team is inherently limited in the experiences its members possess. They operate in something of an echo chamber, exposed only to the products and industry you are trying to promote and the methods used to promote them. Professional firms, however, work with a huge variety of industries, business types, and marketing specialists. These professionals get to learn about and design successful, innovative marketing methods and apply them to diverse sectors of the business community.

Moreover, the employees at these agencies also have to meet ongoing professional development requirements. These agencies make sure that their employees, developers, and designers stay current with the latest trends. Once you partner with a professional digital agency, you are in a position to tap into this insight and expertise in a tremendously effective and targeted way.
So, if you want to soar in your industry, you have to make the most of the digital space. You have to take the steps that are crucial for your business, and the foremost step should be to have a word with a professional and innovative digital agency.
About the Author

Ryan Rambajohn is a hands-on marketing professional experienced at launching new business initiatives and developing online marketing programs. He is the founder of Reach Above Media, which offers web design in New York, along with SEO and mobile app development.

Read More →

Log File Analysis For SEO

Log file analysis is a lost art. But it can save your SEO butt. This post is half-story, half-tutorial about how web server log files helped solve a problem, how we analyzed them, and what we found. Read on, and you’ll learn to use grep plus Screaming Frog to saw through logs and do some serious, old school organic search magic (Saw. Logs. Get it?!!! I’m allowed two dad jokes per day. That’s one).

If you don’t know what a log file is or why they matter, read What Is A Log File? first.

The Problem

We worked with a client who rebuilt their site. Organic traffic plunged and didn’t recover.

The site was gigantic: Millions of pages. We figured Google Search Console would explode with dire warnings. Nope. GSC reported that everything was fine. Bing Webmaster Tools said the same. A site crawl might help but would take a long time.

Phooey.

We needed to see how Googlebot, BingBot, and others were hitting our client’s site. Time for some log file analysis.

This tutorial requires command-line-fu. On Linux or OS X, open Terminal. On a PC, use a utility like cmder.net.

Tools We Used

My favorite log analysis tool is Screaming Frog Log File Analyser. It’s inexpensive, easy to learn, and has more than enough oomph for complex tasks.

But our client’s log file snapshot was over 20 gigabytes of data. Screaming Frog didn’t like that at all:

[Screenshot: Screaming Frog Log File Analyser can handle large files, but 20 gigs is too much. Screaming Frog says "WTF?"]

Understandable. We needed to reduce the file size first. Time for some grep.

Get A Grep

OK, that’s dad joke number two. I’ll stop.

The full, 20-gig log file included browser and bot traffic. All we needed was the bots. How do you filter a mongobulous log file?

  • Open it in Excel (hahahahahahaahahahah)
  • Import it into a database (maybe, but egads)
  • Open it in a text editor (and watch your laptop melt to slag)
  • Use a zippy, command-line filtering program. That's grep.

“grep” stands for “global regular expression print.” Grep’s parents really didn’t like it. So we all use the acronym, instead. Grep lets you sift through large text files, searching for lines that contain specific strings. Here’s the important part: It does all that without opening the file. Your computer can process a lot more data if it doesn’t have to show it to you. So grep is super-speedy.

Here’s the syntax for a typical grep command:

grep [options] [thing to find] [files to search for the thing]

Here's an example: it searches every file that ends in ".log" in the current folder, looking for lines that include "Googlebot," then appends those lines to a file called botsonly.txt:

grep -h -i 'Googlebot' *.log >> botsonly.txt

The -h means “Don’t record the name of the file where you found this text.” We want a standard log file. Adding the filename at the start of every line would mess that up.

The -i means “ignore case.”

Googlebot is the string to find.

*.log says “search every file in this folder that ends with .log”

The >> botsonly.txt isn't a grep command. It's a little shell trick: >> appends the output of a command to a file instead of printing it to the screen, in this case to botsonly.txt. (A single > would overwrite the file each time.)
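
If you want to see the difference for yourself, here's a quick sanity check you can run anywhere (out.txt is just a throwaway file):

echo "first run" > out.txt    # '>' overwrites: out.txt ends up with one line
echo "second run" >> out.txt  # '>>' appends: out.txt now has two lines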

For this client, we wanted to grab multiple bots: Google, Bing, Baidu, DuckDuckBot, and Yandex. So we added -e. That lets us search for multiple strings:

grep -h -i -e 'Googlebot' -e 'Bingbot' -e 'Baiduspider' -e 'DuckDuckBot' -e 'YandexBot' *.log >> botsonly.txt

Every good Linux nerd out there just choked and spat Mountain Dew on their keyboard. You can replace this hackery with some terribly elegant regular expression that accomplishes the same thing in fewer characters. I am not elegant.
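
If you'd rather have the elegant version, grep's -E flag turns on extended regular expressions, so one alternation pattern replaces the string of -e flags. This should produce the same botsonly.txt as the command above:

grep -h -i -E 'Googlebot|Bingbot|Baiduspider|DuckDuckBot|YandexBot' *.log >> botsonly.txt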

Breaking down the full -e command:

-h: Leaves out filenames
-i: Case-insensitive (I'm too lazy to figure out the case for each bot)
-e: Filter for multiple strings, one after each instance of -e
>>: Appends the results to a file

Bots crawl non-page resources, too, like JavaScript files. I didn't need those, so I filtered them out:

grep -h -v '\.js' botsonly.txt >> botsnojs.txt

-v inverts the match, finding all lines that do not include the search string. So the grep command above searched botsonly.txt and wrote all lines that did not include ".js" to a new, even smaller file called botsnojs.txt. (The pattern is quoted so the shell doesn't expand it against filenames in the current folder, and the dot is escaped so it matches a literal period.)

Result: A Smaller Log File

I started with a 20-gigabyte log file that contained a bazillion lines.

After a few minutes, I had a one-gigabyte log file with under a million lines. Log file analysis step one: Complete.
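
If you want to check the damage at each step, wc -l counts lines without opening the file, the same trick that keeps grep fast:

wc -l botsonly.txt botsnojs.txt   # line counts before and after the .js filter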

Analyze in Screaming Frog

Time for Screaming Frog Log File Analyser. Note that this is not their SEO Spider. It’s a different tool.

I opened Screaming Frog, then drag-and-dropped the log file. Poof.

If your log file uses relative addresses—if it doesn’t have your domain name for each request—then Screaming Frog prompts you to enter your site URL.

What We Found

Google was going bonkers. Why? Every page on the client’s site had a link to an inquiry form.

A teeny link. A link with ‘x’ as the anchor text, actually, because it was a leftover from the old code. The link pointed at:

inquiry_form?subject_id=[id]

If you’re an SEO, you just cringed. It’s duplicate content hell: Hundreds of thousands of pages, all pointing at the same inquiry form, all at unique URLs. And Google saw them all:

[Screenshot: Inquiry Forms Run Amok]

60% of Googlebot events hit inquiry_form?subject_id= pages. The client’s site was burning crawl budget.
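
You can put a number like that together straight from the filtered log, too. A quick sketch, assuming the inquiry form path shows up in each log line exactly as written above:

grep -c 'inquiry_form?subject_id=' botsnojs.txt   # bot hits that request the inquiry form
wc -l botsnojs.txt                                # total bot hits in the filtered log

Divide the first number by the second and you have the share of crawl budget going to one form.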

The Fix(es): Why Log Files Matter

First, we wanted to delete the links. That couldn’t happen. Then, we wanted to change all inquiry links to use fragments:

inquiry_form#subject_id=[id]

Google ignores everything after the ‘#.’ Problem solved!

Nope. The development team was slammed. So we tried a few less-than-ideal quick fixes:

  • robots.txt (sketched below)
  • meta robots (also sketched below)
  • rel canonical (no, we didn't actually try that one, because rel canonical was going to work about as well as trying to pee through a Cheerio in a hurricane; any parents out there know whereof I speak)
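
For reference, the first two quick fixes looked roughly like this. These snippets are illustrative; match the path to whatever pattern your own logs show. In robots.txt:

User-agent: *
Disallow: /inquiry_form

And in the <head> of each inquiry form page, a meta robots tag:

<meta name="robots" content="noindex, nofollow">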

Each time, we waited a few days, got a new log file snippet, filtered it, and analyzed it.

We expected Googlebot to follow the various robots directives. It didn’t. Google kept cheerfully crawling every inquiry_form URL, expending crawl budget, and ignoring 50% of our client’s site.

Thanks to the logs, though, we were able to quickly analyze bot behavior and know whether a fix was working. We didn’t have to wait weeks for improvements (or not) in organic traffic or indexation data in Google Search Console.

A Happy Ending

The logs showed that quick fixes weren’t fixing anything. If we were going to resolve this problem, we had to switch to URL fragments. Our analysis made a stronger case. The client raised the priority of this recommendation. The development team got the resources they needed and changed to fragments.

That was in January:

[Screenshot: Immediate Lift From URL Fragments]

These results are bragworthy. But the real story is the log files. They let us do faster, more accurate analysis, diagnose the problem, and then test solutions far sooner than we otherwise could.

If you think your site has a crawl issue, look at the logs. Waste no time. You can thank me later.

I nerd out about this all the time. If you have a question, leave it below in the comments, or hit me up at @portentint.

Or, if you want to make me feel important, reach out on LinkedIn.

The post Log File Analysis For SEO appeared first on Portent.

Read More →

Google’s Latest Broad Core Ranking Update: Florida 2


What’s Going On?

On March 12th, Google released what is being referred to as the Florida 2 algorithm update. Website owners are already noticing significant shifts across their keyword rankings. While Google's algorithm updates vary in how much broad notice they receive, the Florida 2 update is one that every marketer needs to be paying close attention to.

Who Was Impacted by the Florida 2 Algorithm Update?

Google makes several broad ranking algorithm updates each year, but only confirms updates that result in widespread impact. Google did confirm Florida 2, and there are SEOs out there already calling it the “biggest update [Google has made] in years.” Unlike last August’s Medic Update, Florida 2 isn’t targeting a specific niche or vertical, which means the entire search community needs to be paying attention as we try to better understand the changes Google is making to its search algorithm.

While it’s still too early for our team to pinpoint what exactly is being impacted by Florida 2, we’re going to keep a very close eye on where things fall out over the next several days (and weeks).

Here’s what we’ve seen so far:

  • Indiscriminate swings in site traffic and rankings, with some websites reporting zero traffic after the update.
  • Evidence of traffic increases for site owners who are prioritizing quality content and page speed.
  • A worldwide impact: this is not a niche-specific or region-specific update.
  • Potential adjustments in how Google interprets particular search queries.
  • Backlink quality possibly being a very important factor.
  • Short-term keyword ranking changes (declines in ranking that then bounce back to "normal" after a few hours).

My Rankings Took a Hit. What Can I Do?

In short? Nothing. But don’t panic.

As with any Google algorithm update, websites will see increases or declines in their rankings. There is no quick fix for sites or web pages that experience negative results from the update; don’t make the mistake of aggressively changing aspects of your site without fully understanding the broader impact those changes will have. If you are being negatively impacted by Florida 2 (or any other algorithm update), your best bet is continuing to focus on offering the best content you can, as that is what Google always seeks to reward.

For advice on how to produce great content, a good starting point is to review Google's Search Quality Guidelines. For copywriting tips and strategies, check out the copywriting section of the Portent blog.

We’ll continue to update this post as we gather more information.

The post Google’s Latest Broad Core Ranking Update: Florida 2 appeared first on Portent.

Read More →

Brand Update: Maruti Attempts Channel Differentiation Using Nexa and Arena

In 2015, Maruti launched a new channel to support the company's foray into the premium four-wheeler segment. The new channel was branded as Nexa. Nexa catered to the premium range of Maruti cars, such as the Baleno, Brezza, and Ciaz. The Nexa showrooms were designed to project a premium feel for the segment of customers who prefer premium cars.
It was a challenging move for the company because a multi-channel strategy has its share of problems: organizational challenges, differentiation challenges, channel conflict, redundancy, and so on. The need for a premium channel arose because Maruti started off as a maker of affordable cars. Its range of cars never targeted the premium segment, and some of Maruti's earlier forays into that segment failed to gain traction.
Recently, the company tasted success in the mid-range and premium segments with some smart brand launches. Having a premium channel like Nexa enabled the company to give this segment a different kind of experience.
In 2017, the company rebranded its traditional channel as Maruti Arena, which sells mass-market brands like the Alto and Wagon-R. The company also has a commercial channel and a True Value channel for used cars.
In theory, the company has followed a Marketing Channel Segment Differentiation strategy, where the channels differ in terms of the customer segments they cater to. Another type of channel differentiation is Task Differentiation, where the channel members differ in terms of the tasks they perform.
Although the company has tried to create channel differentiation, there are still overlaps in terms of products. While some brands are exclusive to Nexa, others are available in both outlets, which can create potential channel conflict. Marketing channel differentiation based on segments works well when there is minimal overlap of characteristics across segments. In the case of automobiles, segment overlaps are bound to occur because of pricing overlaps as well as aspirational factors: the price of a high-end variant of a hatchback can equal the price of an entry-level variant of a mid-size car. So firms that pursue a channel differentiation strategy should ensure that these overlaps are kept to a minimum.
But with the company's huge market share and product sales, the channel members may not, at this point, be complaining too much about the cannibalization of sales.

Read More →