The Web Traffic News

#1 Website Traffic News Site

0 to 10K Web Traffic Visits Per Month

Here is a detailed and actionable infographic on how to get zero to ten thousand web visitors to your website or blog per month:

[Infographic: 0 to 10K web traffic visits per month]
Do you know any web traffic building tips or tricks that you care to share?

Leave a comment below and share this as well,

~The Web Traffic Reporter
Mark Edward Brown

Russian Web Traffic and Social Media Slap On The Way

Recently the Russian media regulator Roskomnadzor contacted Google, Facebook and Twitter, alerting them to the possibility of future censorship and a potential outright ban of their services if they don't work with the Russian government regarding Russian censorship and internet laws.

Roskomnadzor dislikes the social media sites' encryption and their unwillingness to share user data. The potential loss to the social media giants could be close to 35 million Internet users in the country, and growing.

Russian web traffic, social media, and blogger laws require anyone with over 3,000 readers per day to have their identity verified through a government registration process. Failure to comply with these rules often results in a complete shutdown of the suspected site or blog.

This law was enacted a few years ago due to a series of protests against the Russian government. Bloggers online were identified as playing a key role in the organization of protests, by informing people where and when to meet in order to speak out against censorship.

The Russian government also blocks websites that speak out against it. Google, Facebook and Twitter all incorporate encryption technology that lets people speak without censorship, making viewpoints impossible to remove, alter, or control.

Currently the Kremlin polices the Internet while investing more and more resources into controlling every aspect of what its people see and say. This is of serious concern because the government is also building its own servers for internet companies: any company that hires engineers inside Russia would have to use these servers for the Russian population, which would make monitoring and control easier.

What are your viewpoints on government control of web traffic and internet control? (Leave a comment below & share this on social media)

~The Web Traffic Reporter

How to Drive Traffic to Your Site While Under a Google Penalty

Has this happened to you: your website has fallen out of Google's good graces and been penalized for one reason or another? If it has, I have great news for you, because listed below are the methods that are working for us to drive web traffic to your site and recover from a Google penalty.

You will usually notice in Webmaster Tools that your site has been hit by an algorithmic or a manual link penalty. This is never a good thing, and it means you have some work to do to get back into good ranking territory.

You are going to have to apply a two-pronged approach while you're working on re-ranking your website: while you work your way out from under the Google penalty, you still need to bring visitors to your site.

Here is how to drive traffic to your site while under a Google penalty and get ranked again:

Start Driving Traffic to Your Website Again

One of the best ways to drive no-cost web traffic to your website is with a web traffic service called Traffic Dynamite. The basic concept is that you list your website and view other people's sites, and in return those people view yours. It's a traffic exchange, and it is the fastest way to get eyeballs on your website. If you can click a mouse, this will jump-start your page views.

Add New Content to Your Site

Creating new content is a must if you have received an algorithmic penalty like Panda. To fix this problem, create new pages as funnels for new site traffic while your other pages are under the penalty.

In order to boost new content and get it recognized, be sure to cover topics positioned for discoverability and focus on relevant, long-tail keyword phrases.

Make sure to answer questions that your readers are actively searching for.

The natural links earned by your new content pages can improve your site's overall link profile as you work to get out from under your penalty.

Guest blogging is still a viable means to publish valuable content to an engaged audience. Be sure to only publish on quality websites that are relevant to your industry.

Pay Your Way Out

To make up for the traffic and revenue lost to an SEO penalty, you can pay to give your content a boost with paid promotion.

Run paid ads on the major social networks. This will boost your content and capture traffic, social shares, backlinks, and leads. The networks I recommend for paid advertising are Facebook, LinkedIn, Twitter, and of course Google.

Don’t be afraid to invest money in your content and share it with the world. You might be surprised to know that viral content often gets its start from paid promotion.

Social Media

If a site is under penalty, the traffic from search results can all but cease to exist. Therefore, it's more important than ever to drive traffic to your site from other sources, like social networks.

You must share your site and new content across all of your social networks to increase exposure and build up your fan base.

When you share your content via social media you'll be supplementing the traffic that was lost while your site was under a penalty. Once a site's penalty is lifted, you should still continue with the above-mentioned strategies to keep your rank and visibility on the rise.

Link Building

Links are still important to your site, and when you've been penalized, natural links are the best way to get your site's link profile cleaned up. Natural links are also the types of links necessary for rebuilding the authority, traffic potential and conversions that were lost as a result of a penalty.

Link exchanges and paid link building are recommended by some because they are the fastest way to recover lost search engine placement and rankings. You can still trade links with other reputable sites; however, that takes a long time and could work against you if you count on your site to bring in revenue and recognition.

Make sure to use Google Webmaster Tools to identify any bad links pointing to or from your site, including links from someone else's site that may be penalized as well. Maintaining a good link profile is now more critical than ever to the positive ranking of your website.
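If you download a backlink export, triaging it can be scripted. Below is a minimal sketch, assuming a hypothetical export already parsed into (linking domain, target URL) pairs; the 50-link threshold is an illustrative choice, not a Google guideline:

```python
from collections import Counter

def flag_suspicious_linkers(rows, max_links_per_domain=50):
    """Flag linking domains that point an unusually large number of links
    at the site -- a common sign of spammy or paid link networks.
    `rows` is a list of (linking_domain, target_url) pairs, e.g. parsed
    from a downloaded backlink export (format assumed for illustration)."""
    counts = Counter(domain for domain, _ in rows)
    return sorted(d for d, n in counts.items() if n > max_links_per_domain)

# Example: one directory blasting 60 sitewide links stands out.
rows = [("example-directory.biz", f"/page-{i}") for i in range(60)]
rows += [("reputable-blog.com", "/post-1"), ("reputable-blog.com", "/post-2")]
print(flag_suspicious_linkers(rows))  # ['example-directory.biz']
```

A flagged domain is only a candidate for review; sitewide links can also be legitimate (blogrolls, footers), so check each one before disavowing.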

If you are unsure of or unfamiliar with any of the information presented above we always recommend consulting with an SEO professional.  The recommended SEO Traffic Expert that we use is Mark Edward Brown at

Here’s to your web traffic success,

Mark “The Web Traffic Reporter” Brown

P.S. leave a comment below if this article was helpful to you


Over Half of Web Traffic Is Not Human

It happened last year for the first time: bot traffic eclipsed human traffic. This year, Incapsula says, 61.5 percent of traffic on the web is non-human.

Now, you might think this portends the arrival of “The Internet of Things”—that ever-promised network that will connect your fridge and car to your smartphone. But it does not.

This non-human traffic is search bots, scrapers, hacking tools, and other human impersonators, little pieces of code skittering across the web. You might describe this phenomenon as The Internet of Thingies.

One thing that gets to me: perhaps you want to generate an extra 100,000 page views for some website? So simple. A programmer friend of mine put it like this: "The basics of sending fake traffic are trivial."

I’m going to tell you how here, even though I think executing such a script is highly unethical, probably fraud, and something you should not do. I’m telling you about it here because people need to understand how jawdroppingly easy it really is.

So, the goal is mimicking humans. Which means that you can’t just send 100,000 visits to the same page. That’d be very suspicious.

So you want to spread the traffic out over a bunch of target pages. But which ones? You don’t want pages that no one ever visits. But you also don’t want to send traffic to pages that people are paying close attention to, which tend to be the most recent ones. So, you want popular pages but not the most popular or recent pages.

Luckily, Google tends to rank the popular, recentish stories more highly. And included with UBot are two little bots that can work in tandem. The first scrapes Google's search suggestions: it starts with the most popular A searches (Amazon, Apple, America's Cup), then the most popular B searches, and so on. Another little bot scrapes the URLs from Google search results.

So the first step in the script would be to use the most popular search suggestions to find popularish stories on the domain (say, and save all those URLs.

The first search would be "amazon." The top 20 URLs, all of which would be Atlantic stories, would get copied into a file. Then the bot would search "apple" and paste in another 20. And so on until you've got 1,000.
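To illustrate just how little logic that collection step takes, here is a sketch of the loop with the scraping stubbed out as plain data (the suggestion lists and result URLs are supplied by hand; this is not UBot's actual code):

```python
def collect_targets(suggestions_by_letter, search_results, limit=1000):
    """Build a de-duplicated list of target URLs: for each popular search
    suggestion, take up to the top 20 result URLs, stopping at `limit`.
    `search_results` maps a query to its result URLs; in a real bot this
    would come from scraping, here it is just data."""
    targets, seen = [], set()
    for letter, queries in sorted(suggestions_by_letter.items()):
        for query in queries:
            for url in search_results.get(query, [])[:20]:
                if url not in seen:
                    seen.add(url)
                    targets.append(url)
                if len(targets) >= limit:
                    return targets
    return targets

suggestions = {"a": ["amazon", "apple"]}
results = {"amazon": ["/story-1", "/story-2"], "apple": ["/story-2", "/story-3"]}
print(collect_targets(suggestions, results, limit=3))  # ['/story-1', '/story-2', '/story-3']
```

That is the entire "targeting" stage of the scheme: a dictionary walk and a set for de-duplication.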

Now, all you’ve got to do is have the bot visit each story, wait for the page to load, and go on to the next URL. Just for good measure, perhaps you’d have the browser “focus” on the ads on the page to increase the site’s engagement metrics.

Loop your program 100 times and you’re done. And you could do the same thing whenever you wanted to.

Of course, the bot described here would be very easy to catch. If anyone looked, you’d need to be fancier to evade detection. For example, when a browser connects to a website, it sends a little token that says, “This is who I am!” And it lists the browser and the operating system, etc.

If we ran the script like this, 100,000 identical user agents would show up in the site's logs, which might be suspicious.
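From the publisher's side, spotting that pattern takes only a few lines of log analysis. A sketch, assuming the User-Agent field has already been extracted from each log line:

```python
from collections import Counter

def dominant_user_agent(log_user_agents):
    """Return the most common user-agent string and its share of all
    requests. One UA accounting for nearly every hit is a classic sign
    of an unsophisticated bot."""
    counts = Counter(log_user_agents)
    ua, n = counts.most_common(1)[0]
    return ua, n / len(log_user_agents)

# A toy log: 98 bot hits with one identical UA, 2 human-looking hits.
hits = ["Mozilla/5.0 (X11; Linux x86_64)"] * 98 + ["Mozilla/5.0 (Macintosh)"] * 2
ua, share = dominant_user_agent(hits)
print(f"{share:.0%} of requests share one UA: {ua}")  # 98% of requests ...
```

Real traffic spreads across hundreds of user-agent strings, so a single string above a few percent of total volume is worth a closer look.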

But the user agent-website relationship is trust-based. Any browser can say, "I'm Chrome running on a Mac." And, in fact, there are pieces of software out there that will generate "realistic" user agent messages, which UBot helpfully lets you plug in.

The hardest part would be obscuring the IP addresses of the visits. If 100,000 visits came from a single computer, that would be a dead giveaway it was a bot. So you could rent a botnet: a bunch of computers that have been hacked to do the bidding of (generally) bad people.

Or you could ask some “friends” to help out via a service like JingLing, which lets people use other people on the network to send traffic to webpages from different IP addresses. You scratch my back; I’ll scratch yours!

But, if the botting process is done subtly, no one might think to check what was going on. Because from a publisher’s perspective, how much do you really want to know?

In the example I gave, no page has gotten more than 100 views, but you’ve added 100,000 views to the site as a whole. It would just seem as if there was more traffic, but it’d all be down at the bottom of the traffic reports where most people have no reason to look.

And indeed, some reports have come out showing that people don’t check. One traffic buyer told Digiday, “We worked with a major supply-side platform partner that was just wink wink, nudge nudge about it. They asked us to explain why almost all of our traffic came from one operating system and the majority had all the same user-agent string.”

That is to say, someone involved in the traffic supply chain was no more sophisticated than a journalist with 10 hours of training using a publicly available piece of software. 

The point is: It’s so easy to build bots that do various things that they are overrunning the human traffic on the web.

Now, to understand the human web, we have to reckon with the logic of the non-human web. It is, in part, shady traffic that allows ad networks and exchanges to flourish. And these automated ad buying platforms — while they do a lot of good, no doubt about it — also put pressure on other publishers to sell ads more cheaply. When they do that, there’s less money for content, and the content quality suffers.

The ease of building bots, in other words, hurts what you read each and every day on the Internet. And it’s all happening deep beneath the shiny web we know and (sometimes) love.

5 Tips for Decreasing Your Blog’s Bounce Rates

Having a high bounce rate is an indication that something is going horribly wrong with your blog.

Either your audience isn’t interested in what you are giving them, they are bored, or they might not trust the content.

It doesn’t really matter what the reason is, the important part is figuring out what is wrong and implementing ways to keep the bounce rate as low as possible.
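For reference, bounce rate is conventionally computed as the share of sessions that viewed only a single page (analytics tools differ slightly in how they define a session):

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Bounce rate: the fraction of sessions that viewed only one page."""
    return single_page_sessions / total_sessions

# If 450 of 600 visitors left after a single page, the bounce rate is 75%.
print(f"{bounce_rate(450, 600):.0%}")  # 75%
```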

To help you out, let’s go over 5 different tips that should keep your viewers engaged and entertained:

Excellent Content-

First things first: your content. You are running a blog, so your content should be the highest quality you can possibly produce. If you are rushing your posts for some reason, slow down and remember that your content is the heart of your blog.

Posting poor content means that your blog will be considered poor; it's that simple. So, want to keep your readers engaged? Get better content.

Internal Links-

After a while, your blog should cover a wide range of topics that are all relevant to your niche. When readers are interested in that particular niche, they will probably have a reason to stay on your site beyond a single article. Provide them with links to related articles to help them navigate to the content they want to see. If they find value in more places than just one article, they will have no problem sticking with your site for as long as they want.

Not only do internal links help bounce rates, but they also help your SEO efforts. Search engines look favorably on relevant internal links, so use them to your advantage!
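One simple way to surface those related links automatically is to rank your other posts by how many tags they share with the current one. A minimal sketch, with hypothetical post titles and tags:

```python
def related_posts(current_tags, other_posts, top_n=3):
    """Rank other posts by tag overlap with the current post -- a simple
    way to auto-generate internal 'related article' links.
    `other_posts` maps a post title to its list of tags."""
    scored = [(len(set(current_tags) & set(tags)), title)
              for title, tags in other_posts.items()]
    scored.sort(key=lambda t: (-t[0], t[1]))  # most overlap first
    return [title for score, title in scored[:top_n] if score > 0]

posts = {
    "SEO Basics": ["seo", "google"],
    "Bounce Rate Tips": ["analytics", "bounce-rate"],
    "Link Building 101": ["seo", "links", "google"],
}
print(related_posts(["seo", "google", "traffic"], posts))
```

Tag overlap is crude but cheap; most blog platforms expose tags already, so this kind of heuristic is easy to bolt on without any extra metadata.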

Beautiful Designs-

Having a beautiful design on your website will definitely increase the time your users stay on the site. If you have an exciting design, they are going to be much more attracted to it. This helps your brand, the overall user experience, and will help retain as many people as possible.

The only downside to getting a professional design is that you might need to hire someone to do it for you. If you are not a professional designer or developer, you might have a hard time creating an effective design on your own. Professional designers can be a bit pricey, but if you look around you might find a quality designer who is looking to build a portfolio and will work for cheap.

Images, Podcast, & Videos-

People tend to stay engaged with media longer than with a page of plain text. Using images, videos, and audio files to help deliver your content will help decrease your blog's bounce rate. Just keep in mind that these videos have to be high quality: even though people pay more attention to video, they won't stay with one that is badly made.

There have been plenty of websites that have hired content creators to help with the implementation of videos and images, and they have had great success. If you are unable to hire any help, then your best bet is to find out exactly what your audience wants to see, then make it as best as you can.

Interaction With Your Audience-

A huge part of a successful blog is the interaction that happens between the blogger and the audience. After a post, you might see a few readers leave a comment or two. It is always best to respond to them as quickly as possible to show that you care about their thoughts and opinions. This also reflects very well on your blog for future readers who are reviewing your content. Responding to comments and messages also gives people a reason to stay and come back to the blog, which helps your bounce rates.

There are many other tips and tricks you can use to reduce the bounce rate on your blog, but these 5 should definitely help you start. Just remember that the faster you take care of your bounce rate issues, the better your blog will be. Also, search engines punish websites with high bounce rates, so as soon as those issues are handled, you should see your rankings climb.

Ness Garcia is a contributor for MakeAWebsite, a website providing honest and genuine reviews of the top-performing web hosts today. They highly recommend to website owners and webmasters.

P.S. – leave a comment below about how this post has helped you keep your blog’s bounce rate low…

Website Traffic Hacker Captured In Spain

Have You been a victim of a Website Traffic Hacker?

A 35-year-old man from Sweden was just captured and arrested in Barcelona, Spain. The man was known online as "SK" and is reported to be a representative of the company Cyberbunker.

Cyberbunker has been reported to be the culprit behind numerous denial-of-service attacks recently aimed at millions of blogs and websites, attacks that halted website traffic flow all over the world.

I reported on the recent attack on WordPress Security world wide over at the Seo Traffic Expert, see that report –> HERE

The Spanish Interior Ministry, along with Dutch, German, British and US police forces, says that the culprit was launching attacks from a mobile van outfitted with various types of antennae and tech equipment that allowed him to attack various computing networks.

The primary target of the website traffic attacks was the internet e-mail watchdog company "Spamhaus", based in London and Geneva. The attacks were launched as a protest against censorship: Cyberbunker claims that Spamhaus has no right to decide what does and does not go on the internet.

Online security, blog security and the value of website traffic are all critical issues that I will continue to report on.  Were you a victim of the recent hacking and cyber attacks?  If you were, I want to hear about it – leave your comments below,

Mark Edward Brown
“The Web Traffic Reporter”