I've been operating a successful ecommerce site since 2001. When it came time to change platforms (the old site was ColdFusion with a custom shopping cart), we searched and settled on nopCommerce. We felt a .NET base would give us future scalability, and the open platform was appealing.

We switched platforms in late May. Currently we are using the Fashion Theme 2.60 (and nop 2.60). We have experienced a 75 to 80% drop in organic search engine traffic. While we expected some drop, this is huge, and there has been NO recovery over the past four months despite our best efforts. We put redirects on our top old-site pages, 'discovered' how to use the SmartSEO plug-in, submitted our sitemap to Google, etc.
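For anyone setting up the same kind of old-page redirects: here is a sketch of a permanent (301) redirect rule, assuming the site runs on IIS with the URL Rewrite module installed (the usual setup for nopCommerce hosting). The old ColdFusion path and the new path below are made-up examples, not our real URLs:

```xml
<!-- web.config fragment: 301 a legacy ColdFusion URL to its new nopCommerce page. -->
<!-- Assumes the IIS URL Rewrite module is installed; paths are hypothetical examples. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Redirect old product page" stopProcessing="true">
        <match url="^products/chair\.cfm$" />
        <action type="Redirect" url="/folding-reclining-chair" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

A 301 (Permanent) rather than a 302 matters here: it tells search engines to transfer the old page's ranking to the new URL instead of keeping both.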

We now use Google's CSE (Custom Search Engine) for our site search, which is much better than full-text search or the default product search in nop. A site search needs to cover more than products; it needs to include categories, manufacturers, topics, etc.

We believe the main culprit for our low organic search traffic is that our site is producing dead links in two ways:

1.  Old-site pages.  Old-site URLs are still being found by search engine robots and landing on a Page Not Found error.  It's been months, but Google tells me these should eventually drop off, and I have added them to our robots.txt file.

2.  Bad-syntax URLs.  Example:  www.mywebsite.com/87/p/1768/folding-reclining-chair
If there were no "87/" before the "p/", this would be a good URL.  Lots of these bad-syntax URLs are being produced/exposed on every site search.  I tried adding http://www.mywebsite.com/*/p* (and http://www.mywebsite.com/*/c* and http://www.mywebsite.com/*/m*) to the robots.txt file, which, even if it worked for the search robots, would not fix the dead links being presented in the site search results.
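For reference, robots.txt Disallow rules take site-relative paths rather than full URLs (a full http://... prefix will not match anything), and the "*" wildcard is a Googlebot/Bingbot extension, not part of the original robots.txt standard. The patterns above would look roughly like this in that form:

```text
# robots.txt sketch -- site-relative equivalents of the patterns tried above.
# Note: '*' wildcards are honored by Googlebot/Bingbot but not guaranteed
# by the original robots.txt standard. This only blocks crawling; it does
# not remove already-indexed URLs or fix the links shown in site search.
User-agent: *
Disallow: /*/p/
Disallow: /*/c/
Disallow: /*/m/
```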

Result: bad user experience... low search engine traffic... low conversion rate.

Should we at least create a custom 404 page? And can all 404 errors be directed to a single page?
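From what I understand, yes: under IIS 7+, all 404s can be routed to one custom page via an httpErrors section in web.config. The key for SEO is that the custom page must still return a 404 status code (not a 302 redirect followed by a 200), or search engines will keep the dead URLs indexed. A sketch, where /PageNotFound is a hypothetical page in the site:

```xml
<!-- web.config sketch: route every 404 to one custom page while keeping
     the 404 status code, so search engines eventually drop the dead URLs.
     Assumes IIS 7+; /PageNotFound is a hypothetical page, not a real one. -->
<system.webServer>
  <httpErrors errorMode="Custom" existingResponse="Replace">
    <remove statusCode="404" />
    <error statusCode="404" path="/PageNotFound" responseMode="ExecuteURL" />
  </httpErrors>
</system.webServer>
```

If the custom page is an ASP.NET page, it may also need to set Response.StatusCode = 404 itself so the executed page does not overwrite the status with a 200.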

It appears that Google is finding URLs for pages that do not exist. If Google only finds pages (it does not create URLs), then nopCommerce must be creating URLs without creating the pages behind them??

Big THANKS to anyone who can help in any way,

castironcook

10/25/12 UPDATE:

Problem solved (acceptably).

Removing the bad URLs from being crawled (via robots.txt) was working, but the wait for them to drop out of the index was interminable and continued to compromise our site search (approx. 75% of site search results were bad links!). Google looked at my problem and suggested I remove the URL patterns using their URL Removal Tool. It works!

Still don't know how these mystery URLs were created, but the pages apparently did exist at one point and were crawled by Google (I think the 2.60 upgrade fixed it).