
NetSmarter.com Blog: Web Traffic and SEO (Page 2)

How to Use Web 2.0 Tricks to Get Good Rankings

I just learned about a method for getting good Google rankings. Step one: write a post on your blog. Step two: submit a snippet of the post to sites like Digg and Propeller. Step three: bookmark the snippet pages on sites like Delicious, Reddit, and StumbleUpon. Step four: ping the bookmark RSS feeds using sites like Autopinger and Pingomatic. (Note: Reddit is nofollow until you get at least one...
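If you want to script step four, most ping services speak the standard weblogUpdates XML-RPC interface; here is a minimal Python sketch against Ping-O-Matic's public endpoint at rpc.pingomatic.com (the blog name and URL are placeholders):

    import xmlrpc.client

    # Ping-O-Matic (like most ping services) accepts the standard
    # weblogUpdates.ping(name, url) XML-RPC call.
    server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
    response = server.weblogUpdates.ping(
        "My Blog",                    # your blog's name (placeholder)
        "http://example.com/blog/",   # your feed or blog URL (placeholder)
    )
    # The response is typically a dict with 'flerror' (True on failure)
    # and a human-readable 'message'.
    print(response)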

Best of the Web Directory (BOTW) Promo Code (New!)

Best of the Web Directory (BOTW) is one of the most useful directories for submitting your websites. Use the following promo code to get 20% off the regular price of Best of the Web's website submission and sponsorship advertising: ZOMBIE. This promo code expires on October 31st, 2008. To use this coupon code, click here or on the coupon image below to visit BOTW and enter the promo...

Put your sitemap URL in robots.txt

Did you know that as of April 11th there is no longer a need to manually submit your sitemap to search engines? Last fall, the major search engines agreed on a common sitemap format. You can now add a simple line to your robots.txt file to let the engines know where your sitemap document resides on your site. Just include the following line in your robots.txt file and you should be all set: Sitemap:...
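The Sitemap directive takes the full URL of your sitemap file. For example, assuming your sitemap lives at the root of your site (the domain here is a placeholder), the line in robots.txt would look like this:

    Sitemap: http://www.example.com/sitemap.xml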

Blog Traffic Data

According to eMarketer.com, comScore estimates that about 30% of US Internet users visited blogs in the first quarter of 2005. Nielsen//NetRatings says the top 50 blog sites, including blog hosts, draw about 20% of active Internet users. comScore's blogging study, which was sponsored by Gawker Media and SixApart, estimates that 50 million US Internet users visited blogs in the first quarter of 2005, up...

Search Engines penalize duplicate content

I have read some articles regarding SEO and duplicate web content. It seems that search engines, especially Google, will penalize pages that contain duplicate content. The duplicate content might come from other websites. Because Google keeps a large database of websites and has smart algorithms, it can detect duplicate content and penalize the pages. So I guess to avoid being...
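Google hasn't published how its detection works, but one common way to spot near-duplicate text is shingling: break each page into overlapping word n-grams and measure the overlap. Here is a generic Python sketch (the page text and the 0.8 threshold are just illustrations, not Google's actual method):

    # Near-duplicate detection via word shingles (Jaccard similarity).
    # A generic illustration, not Google's actual algorithm.
    def shingles(text, n=4):
        words = text.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

    def jaccard(a, b):
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

    page1 = "text of the first page ..."    # placeholder content
    page2 = "text of the second page ..."   # placeholder content
    if jaccard(page1, page2) > 0.8:         # arbitrary threshold
        print("these pages look like near-duplicates")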
