SEO on a new website

Submolecule

Expert Member
Joined
Nov 26, 2008
Messages
1,000
Hi everyone.

I am the webmaster for an online music publishing website, and I would like to know if there is anything I can do to improve my search results in Google, Yahoo, Live, etc. I have read a lot on the web about writing lots of content, linking to other sites and so on. Is there anything else I need to know to optimize the site for search engines?

Thank You
 

Kloon

Expert Member
Joined
Nov 6, 2006
Messages
1,668
Mind giving us a link to the site so we can check it out?
 

FarligOpptreden

Executive Member
Joined
Mar 5, 2007
Messages
5,396
Well, you could go for friendlier SEO URLs. For example, instead of having:

Code:
www.morethanmusic.co.za/news/1234.html

...try having something like:

Code:
www.morethanmusic.co.za/news/topic-of-news-item-1234.html
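
If it helps, here's a rough sketch of how you could generate those slugs (Python, and the headline/ID here are just made-up examples, not anything from your site):

Code:
# Rough sketch: turn a news headline into a slug-style URL like the one above.
import re

def slug_url(item_id, headline):
    slug = headline.lower()
    # keep letters and digits, turn everything else into single dashes
    slug = re.sub(r"[^a-z0-9]+", "-", slug).strip("-")
    return "www.morethanmusic.co.za/news/%s-%d.html" % (slug, item_id)

print(slug_url(1234, "Topic of News Item"))
# -> www.morethanmusic.co.za/news/topic-of-news-item-1234.html

However you build the URL, the point is that the topic text ends up in it rather than just a numeric ID.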
 

LiamRG

Well-Known Member
Joined
Oct 7, 2008
Messages
148
Hmmm, same kind of forum app as MyBB.

I think before you consider improving SEO on your site, improve the consistency first. You are using so many different font sizes that, at first glance, it's rather harsh on the eyes and not easy to follow.

Your main image at the top of the page needs its quality improved (include your advertising images in your image optimisation).

Seriously, you can do more damage if you start bringing people to a site that hasn't been polished.
Do this first, then start on your SEO.

Sadly, your site was developed first and you are only considering SEO now. Normally SEO is considered during the development phase. This is not a train smash, though.
 

Submolecule

Expert Member
Joined
Nov 26, 2008
Messages
1,000
Thank you all for your help. FarligOpptreden, you made a really good point about the URLs; we are busy changing them all. LiamRG, I agree 100% with what you are saying, thank you for your input.
 

Optimus01

Well-Known Member
Joined
Nov 16, 2009
Messages
127
I also had a look at your site and you are def doing a lot right. :)

Clean URLs are good, blogs are good.

I see Google has indexed 313 of your pages; if you have many more than this, you could always resubmit your sitemap so that it knows about your new content.

Another basic I like to get right is meta tags. This is simple to do and often very effective. I know everyone has their own preference for their length, etc., but I like to stick to this format:

Title tag: 70 characters
Meta Description: 220 characters
Meta Keywords: 250 characters
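
If it helps, here is a quick sketch of trimming fields to those budgets (the limits are just my preference, and this little helper is made up, not from any SEO tool):

Code:
# Rough sketch: cut a meta field down to the character budget listed above.
LIMITS = {"title": 70, "description": 220, "keywords": 250}

def trim_meta(field, text):
    """Trim text to the character budget for the given meta field."""
    limit = LIMITS[field]
    if len(text) <= limit:
        return text
    # cut at the last space before the limit so a word isn't chopped in half
    return text[:limit].rsplit(" ", 1)[0]

# prints the title trimmed to fit the 70-character budget
print(trim_meta("title", "More Than Music | online music publishing, news, reviews, interviews and articles"))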
 

guest2013-1

guest
Joined
Aug 22, 2003
Messages
19,800
How would I get more pages indexed with Google?

I've noticed only about 10-15% of my pages being indexed, but I have around 2,000-5,000 links (tags, posts, categories, etc.) in my sitemap.
 

Optimus01

Well-Known Member
Joined
Nov 16, 2009
Messages
127
Have you tried resubmitting a recently updated sitemap through Google Webmaster Tools? There you can also check how many pages Google has actually indexed (I'm sure you already know that, though ;)).

5,000 is a lot of pages (relatively speaking, anyway), and I have heard of instances where Google doesn't like to index so many pages, especially if they were all created in a short time period.

If this does not help, please let me know and I can look deeper into it and get back to you.
 

guest2013-1

guest
Joined
Aug 22, 2003
Messages
19,800
My sitemaps get downloaded at least 4 times a week by Google themselves, so there's no need to resubmit them (everything is on Google Webmaster Tools).

One blog entry is relevant to 2-3 categories and contains 10-odd keywords, so one blog entry of mine is listed at least 20 times with different keywords/structures in the URL itself (I skip date-based stuff, it looks ugly).

So it's quite easy to get to 5,000 URLs in the sitemap with only 100 or so blog entries.

I guess that impacts the way Google indexes?
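
Very roughly, the fan-out looks something like this (the paths are made-up examples, not my real URL structure, but the multiplication is the point):

Code:
# Rough sketch of how one post fans out into many sitemap URLs under a
# category + keyword/tag scheme; paths below are hypothetical.
def urls_for_post(slug, categories, keywords):
    urls = []
    for cat in categories:
        urls.append("/%s/%s/" % (cat, slug))             # plain category URL
        for kw in keywords:
            urls.append("/%s/%s/%s/" % (cat, kw, slug))  # category + keyword URL
    return urls

one_post = urls_for_post("some-entry", ["cat1", "cat2", "cat3"],
                         ["kw%d" % i for i in range(10)])
print(len(one_post))        # 33 URLs for a single entry
print(len(one_post) * 100)  # 3300 for ~100 entries -- into the thousands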
 

Optimus01

Well-Known Member
Joined
Nov 16, 2009
Messages
127
Interesting...

I'm guessing Google sees it as duplicate content.

Does Google actually scan your sitemap 4 times a week of its own accord, or is this a default setting? I know that forced site scans and automatic submissions can be bad, especially if nothing has changed on your site.
 

guest2013-1

guest
Joined
Aug 22, 2003
Messages
19,800
It's sometimes lower, sometimes higher, up to as often as every 18 hours, depending on how much I post. Not sure if there's a setting somewhere I could set for this?
 

bradrap

Senior Member
Joined
Nov 16, 2005
Messages
851
IIRC it's only authority sites that can set the Google crawl speed. As far as indexing your sites goes, it all depends on the age of the domain; if it was registered within the last 6 months, it could take months for Google to crawl your site.

Something to help improve this is setting nofollow on links to pages like your privacy policy, sitemap, links to other sites' pages, etc. Nofollow everything but the relevant pages you want indexed; also, putting links on your homepage directly to those pages should get them indexed a bit quicker.

Also, on your XML sitemap, do you set the changefreq and priority on your pages?
Don't forget to ping your sitemap with every update.
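
As a rough sketch (the domain, page list, changefreq and priority values below are all made up; adjust for your own site):

Code:
# Rough sketch of a sitemaps.org-style sitemap with changefreq and priority.
from xml.sax.saxutils import escape

BASE = "http://www.example.com"  # made-up domain; substitute your own

# hypothetical pages: (path, changefreq, priority)
PAGES = [
    ("/",                            "daily",   "1.0"),
    ("/news/",                       "daily",   "0.8"),
    ("/news/some-article-1234.html", "monthly", "0.6"),
    ("/privacy-policy.html",         "yearly",  "0.1"),
]

def build_sitemap(pages):
    out = ['<?xml version="1.0" encoding="UTF-8"?>',
           '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for path, freq, prio in pages:
        out.append("  <url>")
        out.append("    <loc>%s</loc>" % escape(BASE + path))
        out.append("    <changefreq>%s</changefreq>" % freq)
        out.append("    <priority>%s</priority>" % prio)
        out.append("  </url>")
    out.append("</urlset>")
    return "\n".join(out)

with open("sitemap.xml", "w") as f:
    f.write(build_sitemap(PAGES))

# Pinging is then just requesting a URL like
#   http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml
# after the file has been regenerated.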
 

guest2013-1

guest
Joined
Aug 22, 2003
Messages
19,800
Yea, I have priorities set. And I do ping my sitemap (including Yahoo and Bing) from my blog (and several other blog indexing thingies) each time I post.

My domain is about 10 years old now, with the others only reaching the age of 1 next month. I did see some traffic pick up almost immediately on my "newer" sites after I posted again (after 3 months of posting nothing).
 

convalescent

New Member
Joined
Dec 1, 2009
Messages
8
Something to help improve this is setting nofollow on links to pages like your privacy policy, sitemap, links to other sites' pages, etc. Nofollow everything but the relevant pages you want indexed; also, putting links on your homepage directly to those pages should get them indexed a bit quicker.

Open it up. Nofollow/PageRank sculpting doesn't work anymore, and it's not going to increase the number of pages that get indexed by G.

If you want pages to get indexed, you need to get a couple of deep links to the inner parts of your site. There are some directories out there that allow deep linking, and that would be my first stop in the link-building process.

Courting the crawl is one thing; getting indexed is another. Having the site's sitemap downloaded doesn't mean it's being indexed, it means G is doing a superficial crawl of your site. You need those deep links to get G to do a deep crawl of your site.

Title tag: 70 characters
Meta Description: 220 characters
Meta Keywords: 250 characters

Why the meta keywords? These have been dropped by Google (in 2006), and recently Yahoo announced the same.

Ideally, my advice is no more than 64 characters for title tags and 128 for descriptions.

Pointless having more, IMO.
 