Manage JavaScript Code On Your Blog In A Better Way Using Google Tag Manager

Google Tag Manager, another awesome tool from Google’s arsenal, can be used to manage the JavaScript snippets on your website in a more efficient way. Your site could contain multiple JavaScript snippets – for example, the Google Analytics code, Alexa’s analytics code, etc. With Google Tag Manager you can organize all these codes in one place, which is much easier to manage, and you don’t need any technical knowledge beyond a free hand to copy-paste some code.


Although this tool has been around since 2012, I think not many people are utilizing it to increase their productivity. This is what tempted me to write this post on how to set up and manage Google Tag Manager.

So What Is A Tag?

A tag (in the context of Google Tag Manager) is a JavaScript snippet which sends information from your site to third-party applications like Google Analytics, AdWords, Crazy Egg, etc.

Google Tag Manager operates around three main elements: Tags, Rules and Macros.

  • Tags = A tag is a snippet of HTML/JavaScript code that executes (or fires) on a page.
  • Rules = Rules specify when a tag should fire, such as when a specific page is loaded or in response to an action on the page.
  • Macros = Macros are name-value pairs whose values are filled in at runtime; rules test them to decide whether a tag should fire (see the sketch below).
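
For example, a value pushed into the data layer can act as a macro that a rule checks before firing a tag. Here is a minimal sketch; the key and event names are hypothetical:

<script>
  // Ensure the data layer exists, then push values GTM can read at runtime
  var dataLayer = window.dataLayer || [];
  dataLayer.push({
    'pageCategory': 'blog-post',   // hypothetical value a rule can test
    'event': 'newsletterSignup'    // hypothetical event a rule can fire a tag on
  });
</script>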

Why You Need It?

With so many different codes on your webpage, it becomes bloated and bogs down on speed. With Google mentioning in clear words How Speed Is Now A Ranking Factor, a bloated site could very well end up among the ones being penalized. That is the biggest factor, but apart from it, managing all your different codes from a single place also gives you peace of mind.

How It Works?

Google Tag Manager works via its own container tag that you place on all your website pages. The container tag replaces all other tags on your site, including tags from AdWords, Google Analytics, Floodlight, and third parties. Once the Google Tag Manager container tag has been added to your site, you update, add, and administer additional tags right from the Google Tag Manager web application.

How To Implement It?

To implement Google Tag Manager on your site, follow these simple steps :-

Set Up An Account :

Creating a Tag Manager account is easy: just sign in here with your Google ID and you are good to go.

Create A Container For Your Website :

If you have multiple websites, it is better to create a separate container for each of them so that they are easy to identify.

Add The Container Snippet Code To Your Website’s Source Code :

Once you have successfully created your account and a container for your website, you will be presented with a container code which you need to add to your site’s source code. This should not be a big task for an average user, as you just have to copy and paste the code.
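
Always copy the exact snippet from your own Tag Manager account, but for reference it looks roughly like this, where GTM-XXXXXX stands in for your container ID, and it goes right after the opening <body> tag:

<!-- Google Tag Manager -->
<noscript><iframe src="//www.googletagmanager.com/ns.html?id=GTM-XXXXXX"
height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'//www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXXXX');</script>
<!-- End Google Tag Manager -->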

Create Tags For JavaScript Codes On Your Website :

The next step is to create a separate tag for every different JavaScript code on your website. For example, if you have codes for Google Analytics, AdWords, and Crazy Egg, then you need to create a different tag for each of them.

Publish The Tags :

Lastly, the tool will show you a preview of what you have just set up, and you can click Publish.

Remove The JavaScript Code From Your Website :

Now that you have created the tags in your Google Tag Manager account and implemented the container code on your website, you no longer need the old JavaScript snippets for Google Analytics and the rest. Remove them, and your site should now be much lighter than before.

Conclusion :-

Using Google Tag Manager could seem like a lot of work at first, but looking at the big picture, not having to change multiple codes in different places for every minor update will leave you blessing Google for this awesome tool.

So, Have you moved to Google Tag Manager yet? How did you find the transition?

Don’t Let This New Algorithm Update From Google Affect Your Site’s Traffic

As Neil Patel of Quicksprout says, “Don’t Get Caught With Your Pants Down” with this new update from Google. With this new algorithm update, Google is literally pushing webmasters to make their sites more mobile-friendly.

The unusual thing about this update is that Google is letting everyone know beforehand what changes it will bring. According to them, it will be much bigger than the ‘Panda’ update.

So What Is Google Up To With This Update :

In layman’s terms, this update will push webmasters to make their sites more mobile-friendly so that they don’t lose traffic. A large chunk of the user base is on mobile devices, so it’s natural for Google to make ‘mobile friendliness’ a ranking factor.

Google is going to update their algorithm again, and if you’re on the wrong side of the law you might lose a lot of traffic.

The update takes effect from the 21st of April, and you should be as ready for it as possible if you don’t want your mobile traffic to drop.

Changes Needed :

Responsive Design :-

Your website design should be responsive so that viewability doesn’t suffer across different devices. Don’t go with a separate mobile site, as those are hectic to maintain; a responsive design does the job more easily.
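
To give a rough idea, a responsive page pairs a viewport meta tag with CSS media queries. A minimal sketch (the class names are illustrative):

<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* fluid content column that never exceeds 960px */
  .content { width: 90%; max-width: 960px; margin: 0 auto; }
  /* collapse secondary content on small screens */
  @media (max-width: 600px) {
    .sidebar { display: none; }
  }
</style>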

Speed-It-Up :-

Site speed has become a major ranking factor recently, so it is more than necessary to speed up your website to stay in the good graces of Google almighty. Whether you are targeting desktop traffic, mobile traffic, or both, your site must be considerably fast, meaning no more than 2.5 to 3.0 seconds of loading time.

You can use this guide to optimize your website for speed.
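
One common quick win, for example, is loading third-party scripts asynchronously so they don’t block page rendering (the script URL below is just a placeholder):

<!-- async lets the browser keep parsing the page while the script downloads -->
<script async src="https://example.com/analytics.js"></script>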

Conclusion :-

So, as many experts are suggesting – don’t wait for this update to hit your rankings; make the changes beforehand. If you are thinking about making these changes after the 21st of April, you will be in trouble, as Google will need some time to crawl your site and make the necessary updates to its index… #Don’tWait

How “WordPress SEO By Yoast” Could Get Your Site Hacked [Security Alert]

If you are running your site on the WordPress platform and using this awesome plugin [WordPress SEO by Yoast], then I would advise you to update it first, if you haven’t already, before reading further. A huge flaw which puts your site in danger, and could even get it hacked, was found in the plugin by freelance security consultant Ryan Dewhurst.


You can read more about the technical aspects of the bug in the WPScan Vulnerability Database. According to it:

The authenticated Blind SQL Injection vulnerability can be found within the 'admin/class-bulk-editor-list-table.php' file. The orderby and order GET parameters are not sufficiently sanitised before being used within a SQL query.

In layman’s terms, a malicious hacker could alter your database by tricking a logged-in author into visiting a malformed URL through social engineering.
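
To illustrate the class of bug in general terms – this is a sketch in JavaScript-style pseudocode, not Yoast’s actual PHP code, and req and db stand in for a framework’s request and database handles:

// Vulnerable: the attacker-controlled "orderby" value is concatenated into the SQL
var orderby = req.query.orderby;
db.query("SELECT * FROM posts ORDER BY " + orderby);

// Safer: only accept values from a known whitelist
var allowed = ['post_title', 'post_date'];
var safe = allowed.indexOf(orderby) !== -1 ? orderby : 'post_date';
db.query("SELECT * FROM posts ORDER BY " + safe);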

The bug was so severe that the WordPress team force-pushed this update, meaning the plugin is updated automatically unless you have turned the auto-update feature off. The update will have been rolled out to you as follows:

  • If you were running 1.7 or higher, you’ll have been auto-updated to 1.7.4.
  • If you were running on 1.6.*, you’ll have been updated to 1.6.4.
  • If you were running on 1.5.*, you’ll have been updated to 1.5.7.

Yesterday the Yoast team released a blog post outlining the bug and what they did to patch the flaw.

So, all in all, if you’re on an older version of the plugin, you must update it as soon as possible to avoid any risk of your site getting hacked or compromised.

Note: WordPress SEO by Yoast Premium users need to manually update the plugin by going to Plugins -> Installed Plugins -> WordPress SEO by Yoast and clicking on ‘update plugin’.

How To Create And Optimize Robots.txt For Search Engines

Robots.txt, a file residing in the root directory of your website which gives directions to spiders and crawlers, is one of the most underappreciated items on your SEO list. This file follows the Robots Exclusion Standard, also known as the Robots Exclusion Protocol: a standard used by websites to tell web crawlers and spiders whether or not to crawl a certain webpage.


According to Wikipedia: The standard specifies the instruction format used to inform the robot about which areas of the website should not be processed or scanned. Robots are often used by search engines to categorize and archive web sites, or by webmasters to proofread source code. Not all robots cooperate with the standard; email harvesters, spambots, and malware robots that scan for security vulnerabilities often ignore it. The standard is different from, but can be used in conjunction with, Sitemaps, a robot inclusion standard for websites.

Why Should You Care About Robots.txt?

  • Improper usage of the robots.txt file can hurt your ranking
  • The robots.txt file controls how search engine spiders see and interact with your webpages
  • This file is mentioned in several of the Google guidelines
  • This file, and the bots it interacts with, are fundamental parts of how search engines work

What You Should Do First :-

  • Check if you have a robots.txt file already.
  • If yes, check whether it’s blocking important files from crawlers and spiders.
  • If not, decide whether you need one.

Determining The Existence Of Robots.txt :-

To check whether a robots.txt file already exists, just enter your URL into the address bar and append /robots.txt to it.

For Example :- www.technonerdz.org/robots.txt

Determining Robots.txt’s Effect On SEO :-

To determine whether your robots.txt is blocking important files which could help search engines rank your pages, you can use this tool by FeedtheBot. The tool works mainly on Google’s guidelines for webmasters. But to understand completely how robots.txt works, you need to understand its contents yourself.

Keep reading to learn whether your site needs a robots.txt file or not.

Do You Need A Robots.txt File?

There are many cases where a website doesn’t need a robots.txt file, but including one doesn’t hurt either. If you are not sure whether your site needs one, refer to the following points; if any of them holds true for you, then you should have a robots.txt file.

  • You want some of your content to be blocked from search engines and site crawlers.
  • You want your underdeveloped but live site not to be indexed until it is fully developed.
  • You want to block malicious bots from crawling your site and unnecessarily loading up your server.
  • You need to give proper directions to bots for affiliate or paid links on your site.
  • You need one or all of the above things.

In case you decide that you are better off without a robots.txt file, that’s OK, but in that case bots will have full access to your site. If you want to create the file, you can follow the easy guidelines below.

How To Create Robots.txt For Your Site :-

Robots.txt is nothing but a text file in your site’s root directory. To create one, just open a text editor and start typing the directives you want the crawlers to follow.

Directives :-

Allow Indexing Of Everything : If you want the spiders to crawl and index everything on your website, add these rules to your robots.txt.

User-Agent: *
Allow: /

Disallow Indexing Of Everything : To block the spiders from your site completely, you need to use these directives.

User-Agent: *
Disallow: /

Disallow Indexing Of A Specific Folder : Add these directives to block just a specific folder on your site to the crawlers.

User-Agent: *
Disallow: /folder/

Disallow Access To A Particular Bot : Sometimes you want to block access for a particular bot, for reasons such as content scraping, spamming, or other malicious activity. For example, to block Googlebot completely:

User-Agent: Googlebot
Disallow: /

Set Crawl Rate : Setting a crawl rate means telling crawlers and spiders how long to wait between successive requests, which limits the amount of traffic they send to your site in a given period. Note that it could make Google and other search engines visit your site less frequently.

User-Agent: bingbot
Crawl-delay: 10

Here the 10 is the number of seconds the bot should wait between successive requests.

Note that Google doesn’t support the crawl-delay directive in robots.txt, but you can set a crawl limit from Google Webmaster Tools.
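
Putting the pieces together, a complete robots.txt for a typical blog might look something like this (the paths and sitemap URL are only examples; adjust them to your own site):

User-Agent: *
Disallow: /wp-admin/
Disallow: /drafts/

User-Agent: bingbot
Crawl-delay: 10

Sitemap: http://www.example.com/sitemap.xml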

Conclusion :-

So now that you know how to create and use a robots.txt file, it’s up to you to implement it on your site. For further reading, I highly recommend this great article from SEOBOOK – Robots.txt Tutorial.

If this article helped you create a highly optimized robots.txt for your site, or if you found it useful, then don’t forget to share it among your peers.

5 Places To Share Your Killer Quality Content For Insane Traffic

If you’ve finished writing a killer quality article and are waiting for the traffic to come organically, then let me be the bearer of bad news: You’re Doing It Wrong..!! Writing a quality article alone doesn’t guarantee that it will rank high on search engines; half of the work is done, but the remaining half is much harder than writing quality content.


Thousands of bloggers churn out millions of quality articles daily that don’t rank well for one single reason: their authors don’t know how to promote them well enough among the targeted audience. For an article to rank well in search engines, the author must optimize it using on-page and off-page S.E.O. techniques. As the search engines get smarter by regularly updating their indexing algorithms, they are also taking the social reach of an article into consideration for ranking. If a post is actively shared or liked on social media, it sends a much more positive signal to search engines than the quality of the content alone.

So now the question arises: where to share your killer quality content for insane traffic? If you don’t know the answer, this post is going to help you immensely. Here I have listed some of the biggest social media sites which you can use to drive thousands of targeted visitors to your blog in a very short span of time.

5 Awesome Places To Promote Your Quality Content :-

REDDIT :

A social media site which categorizes user-generated links into subreddits based on their category, Reddit is also called the front page of the internet. After a link is posted, users can vote it up or down depending on whether they like it, which in turn decides the position of the link on the page. It is one of the biggest traffic sources for this blog and also my favorite social platform.

You cannot grasp the insane amount of traffic it can send to your site until you use it. I submitted a very casually written article on How To Find Bugs In A Website to one of its subreddits, and I got about 11,000 views from targeted users in a day. So you can imagine what a nice in-depth article would bring. My suggestion: if you have a nice article, then you should submit it to Reddit, but do not spam, as it follows one of the strictest policies against spammers and can ban you in minutes.


StumbleUpon :

StumbleUpon, a content curation website, is one of the oldest and most popular social bookmarking sites. You just need to create an account, share your article with the right tags and description, and wait. If your article is found and liked by users, it will be added to StumbleUpon’s index.

It works this way: after you submit your URL, it is shown to a limited number of users; if it is liked and up-voted, it is added to the index, and if not, it is dropped. Some 7 months ago I submitted an article on Top 50 Hacking Tools but didn’t get any traffic; then last month a user with only 90+ likes found my article and up-voted it, and since then I have got around 20,500+ views on that article alone, and it’s been liked by some 1,700 stumblers. Getting traffic from StumbleUpon can take some time, but the wait is totally worth it.


Google+ :

Yes, Google+. Many of us don’t realize the effect Google+ can have on our posts’ rankings in search engines, especially Google. Though there is no official word on whether it affects search rankings or not, my research shows that a post which is shared handsomely on Google+ tends to rank higher compared to articles with shares on other social platforms.

Being a newer social media platform, it doesn’t have that large a user base, which is exactly why you should use it: with less competition, your posts can rank higher for longer compared to other social sites. Google+ communities also offer decent traffic if used correctly. Be careful before posting, as some communities only allow discussions and not links; ignoring the rules can get you flagged as a spammer, and moderators could ban you.


Facebook :

Creating a Facebook page for your brand is a must for any blogger. With 1.35 billion users, of which 864 million are regularly active, Facebook is one of the most promising and prominent social media platforms for bloggers. It is the top source of referral traffic for many famous bloggers, like Harsh Agrawal of ShoutMeLoud and Neil Patel of Quicksprout.

Though I’m not a fan of paid advertising, Facebook’s advertising program can put your content in front of a wide variety of audiences in a fraction of the time and cost.


Twitter :

A micro-blogging platform with a nice user base which can be leveraged to direct traffic to your blog. You can tweet your post link multiple times for maximum exposure, but be sure to wait at least 12 to 15 hours between tweets. You can also use services like Tweriod to determine the optimal time for tweeting your blog link.


So these are some of the biggest social platforms which you should utilize after writing your posts. Though there are many more places for blog promotion, following this list is a must for every blogger. Here are some more niche-based social bookmarking websites where you can share your posts.

  • Scoop.It
  • Inbound.org : For SEO, SMO, Blogging and internet marketing related articles.
  • DZone : For Developers
  • DesignFloat : For Web Designers.
  • ManageWP.org : For WordPress related articles.

If there is another source which is working great for you, then why not drop a comment and let me know about it? It could be a great addition to the article. Cheers..

How Optimizing Images Could Get Your Site Higher Traffic

Image optimization is one of the most important factors for any website, yet it is often overlooked by webmasters. Most of us don’t realize the amount of traffic these (high quality) images can bring to our sites. If the word of Google is anything to go by, images account for 65% of total internet traffic.

Optimizing images contributes a large part to on-page optimization, as images make a stronger impression on the user. If you are running a blog, sometimes a focused, high quality image can drive more traffic than the article itself. Every webmaster must optimize the images on his site, because search engines cannot see them while crawling; un-described images appear to them as blank spaces.

Peter Linsley of Google has shared in the past how Google ranks images:

  1. First, the crawler finds images on webpages across the world wide web.
  2. The images are then classified on various factors like color, quality, and whether they are safe for work, and then indexed.
  3. Next, duplicates are identified and the most authoritative version is sorted out.
  4. Finally, a rank is given based on undisclosed factors, just like for text-based websites.

He also goes on to say that users prefer high resolution, in-focus images to low quality ones. Distracting features such as loud watermarks and huge on-image text are a complete NO, as they make the image look cluttered.

According to him, some best practices that a webmaster should follow when attaching images to a webpage are:

  • Try to add an image which is user-focused rather than search-engine-focused.
  • Images should be high quality and large enough for the user to interpret them correctly.
  • Always try to put the image high on the page, above the fold if possible.
  • Adding descriptive text for images helps search engines immensely in categorizing them correctly.

Now that you know how much emphasis a single image can bring to your blog, you must be wondering which factors you should optimize so that your images bring handsome traffic and ranking juice from search engines. So without further ado, let’s get to it.

Relevancy :-

Images should be relevant to the article or the text around them. An image of a waterfall won’t do if the page or article is about Search Engine Optimization.

Alt Tag :-

This is the most important tag (technically an attribute) on an image. The alt text is the alternate text which is shown when the image cannot be loaded by the browser. This text also helps image search engines identify the image.

Name Of The Image :-

The file name of the image is another way of improving your image’s S.E.O. capabilities. Google asks webmasters not to use generic file names like ‘image010.jpg’; instead, use something like ‘wordpress-logo.jpg’ if the image is of the WordPress logo.

Image Size :-

An image that is too bulky in file size increases your page’s latency, which can in turn cause a fall in rankings. So compressing images before serving them is always welcome.

Height & Width :-

Attached images should already be at the size at which they are to be presented; don’t resize them using HTML attributes, as the browser still has to download the full-size file, which slows down page loading.
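
Pulling the factors above together, a well-optimized image tag might look like this (the file name, alt text, and dimensions are only illustrative):

<!-- descriptive file name, descriptive alt text, and width/height that match the actual file -->
<img src="wordpress-logo.jpg" alt="WordPress logo" width="300" height="300">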


These are the points which anyone should follow before adding images. If you are using a Content Management System like WordPress, you can easily automate all of this with a plugin. Smush.it from Yahoo is a great tool which strips unnecessary metadata from images and reduces their size immensely. SEO Friendly Images is another great plugin which updates the alt and title tags for all images on your website.

You can also save an image ‘for web’, which makes it considerably smaller in size, using Adobe Photoshop. A video guide demonstrating how to reduce the size of an image using Adobe Photoshop is attached below.



These were some of the methods recommended from my side, but if you want to further optimize your images, do read this awesome article on Shopify, and don’t forget to provide your valuable feedback in the comments below.

 

What To Look For In The IT Consultants You Would Want To Work With

All business owners know that whatever kind of product they offer or service they provide, they will always find themselves facing steep competition. Because of this, it is important for all entrepreneurs to fully utilize and maximize all available options to make sure that they can always get ahead of their industry rivals.

 


The Internet is one of the platforms that all business owners should know how to correctly and fully utilize to promote their brand. And if you are a business owner, to promote and boost your company’s online presence, you need to have a good website first.

If you don’t have a business website yet or if you want to give your current one a tweak and you don’t currently have any staff members capable of accomplishing this, don’t worry — you can get help from professional IT consultants to achieve your goal.

At present, there are dozens of IT consultancy firms you can get help from. But not all of them will give you what you want; hiring just any self-proclaimed IT firm might leave you wasting your time and money. To make sure that you choose the right IT consultants who can help you reach the results you seek, keep the following tips in mind:

Always go with an IT firm that is highly rated. Choose an IT consultancy company that comes highly recommended by their previous and present clients. To know if an IT consultancy firm is indeed reliable and can deliver on their promises, be on the lookout for positive reviews about the company. You can also visit the IT firm’s website to read about their clients’ success stories and testimonials.

Be sure to check out the IT firm’s portfolio before making your decision. A strong portfolio suggests that the team works with professionalism and a good work ethic. To find out if an IT company really did work on the projects included in its portfolio, search for live links to the websites they claim to have completed. While you’re at it, scroll down to the bottom of each website they created and see if it is “powered by” the IT consultancy company you are thinking of hiring.

Choose IT consultants who listen to your needs and wishes. Although you may not have an idea of what exactly you want your website to look like, don’t immediately say “yes” to all the suggestions of the IT consultant you are working with. A good IT consultant will really listen to your requirements and take note of all the objectives you want to gain with your website, incorporating all of these to fully satisfy your needs and help you achieve your goals.

Finally, look for a company that offers affordable IT services. Don’t settle for one that provides cheap services since you may just end up having a shoddy-looking website or one that doesn’t even match your requirements. Make sure that you compare the service rates of several IT consultancy firms and choose one that offers the best value for your money.

GronTec

Website Verification And Other Strategies For Improving Conversion Rates

Top marks on search engine results. Thousands of social media follows. Hundreds of clicks on your PPC campaign. Even if you sum all these up, they will mean nothing if your website has low conversion rates.

But what exactly are conversion rates? In simple terms, a conversion rate is the percentage of prospective customers who perform the action that you prod them toward through your campaigns, whether online or offline. For example, it may be the percentage of visitors who opt in to your mailing list, or the share of new customers hauled in by your latest print ad; if 1,000 people see your signup page and 25 subscribe, that page converts at 2.5%. Apart from increased profits, high conversion rates mean that you got good returns on your investment.

 


If you run an online business, there are several measures you can implement to improve conversion rates. Bear in mind that you will also have to contend with bounce rates, but by putting these strategies into place, you can minimize your bounce rates and improve conversions.

Use social proof. Sometimes, no matter how well-written your copy is, most customers still look at what other customers have to say about your business and its products or services. Social proof like testimonials and third-party reviews may be utilized to show the quality of your products and even your customer service.

Present complete contact information. One problem with the Internet is that people can hide under the veil of anonymity or fake profiles. Earn the trust of customers by putting contact information on your Web page.

Build credibility with the help of third parties. Earn trust seals by going through a stringent website verification process. This lets your customers know that you have undergone scrutiny in order to ensure their online safety.

Offer guarantees. Guarantees offer the impression that you are confident in your product or service. And the longer your guarantee period, the greater the confidence your prospective customers will have.

Improve the headlines of your pages. Headlines lure in page visitors, and compelling headlines translate into conversions. Make it a habit to conduct A/B testing in order to gauge which design and copy work best with your customers.
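
As a rough illustration (the element ID, headline texts, and event name below are all hypothetical), a few lines of JavaScript are enough to split visitors between two headlines and record which one each visitor saw:

<script>
  // naive A/B split: randomly assign each visitor to variant A or B
  var variant = Math.random() < 0.5 ? 'A' : 'B';
  document.getElementById('headline').textContent =
    variant === 'A' ? 'Boost Your Sales Today' : 'See Why Customers Trust Us';
  // record the assignment so conversions can be compared per variant
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({ event: 'abTest', headlineVariant: variant });
</script>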

By no means are these measures to be considered shortcuts. You cannot expect quick results by implementing these strategies. However, with patience and proper implementation, you can earn the trust of prospective customers and guide them to your website — and actually make them stay put and perform your desired action on your website.

ValidSafe

How Not To Get Penalized By Google : Post Pigeon & Panda Updates

In today’s world if you want your website to get decent organic traffic then you must surrender to the rules and regulations of the almighty GOOGLE.

Google ranks and indexes websites on the basis of various factors like keyword density, meta tags, the number of inbound links, and many more. The specifics are not known to anyone other than Google itself, but the algorithm is updated from time to time.

Major Algorithm Updates In 2014 :

Panda 4.0 – May 19, 2014 :-

A complete update to the algorithm as well as a data refresh.

Pigeon – July 24, 2014 :-

Google shook the local SEO world with an update that dramatically altered some local results and modified how location cues are handled and interpreted. Google claimed that Pigeon created closer ties between the local algorithm and the core algorithm(s).


HTTPS/SSL Update – August 6, 2014 :-

With this update, Google announced that they would be giving a slight boost and preference to secure sites in ranking.
Authorship Removal – August 28, 2014 :-

Google completely removed authorship markup from search results with this update.
So How Not To Get Penalized :

Avoid Violations Of Terms/Conditions :-

Google takes violations of its terms and conditions very seriously and will penalize your website before you know it. Avoid violations at all costs, and don’t think of outsmarting the system; they have zettabytes of data to compare against your little tricks.
A ‘Big No’ To Keyword Stuffing :-

Do not stuff keywords into the content of your site; use them as naturally as possible. A keyword density of 1.5% – 2.0% is considered good practice; in a 1,000-word article, that is roughly 15 to 20 occurrences of the keyword.
Quality Backlinks :-

Backlinks from unreliable or spammy sites can get your website penalized in no time; avoid them at all costs.
Promote With Social Media :-

According to Google, there is no limit to promotion on social media. Create accounts on Facebook, Google+, Twitter, Pinterest, etc.
Guest Posting :-

You can guest post on several forums and blogs, which in turn will give you backlinks and decent referral traffic to your blog or website. Google doesn’t penalize this practice.
Conclusion :-

Hope the information provided here is of help to readers. Follow the methods mentioned above before trying anything else, and don’t pay the so-called S.E.O. agencies to inflate your backlink count, as that will look inorganic to search engines.

Wait patiently and let your site grow gradually.

Do provide feedback on how you liked or disliked this post in the comments section below.