I want to share some search engine optimization considerations that I implemented on this website.
While a few years ago one could easily fool search engines, it has become much harder. What really matters today is content and regular updates. A nicely designed web page, along with high-quality images, is key to making it to the top of the search rankings.
Search engine friendly URLs
Equally important is the readability of links. My web server rewrites all URLs and makes them 'readable' for search engines and humans alike. Filenames, for example of images, should also be meaningful and not just numbered. More about URL rewriting can be found in the .htaccess file.
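As an illustration, the core of such a rewrite setup is a small mod_rewrite rule set; the fragment below is a minimal sketch in the style of Joomla's standard .htaccess (the actual rules on a live site are more extensive):

```apache
# Route all requests that do not match a real file or directory
# through Joomla's index.php, so readable URLs can be resolved there.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ index.php [L]
```

With rules like these, a visitor sees /photography/landscapes instead of a raw index.php?option=...&id=... query string.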
Keywords and metadata
All pages have metadata embedded. This helps search engines better understand what a page is about and for which keyword combinations the page is relevant to a user.
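For illustration, this is the kind of metadata Joomla emits in a page's head section; the titles and values below are placeholders, not the real ones from this site:

```html
<head>
  <title>Landscape photography | example.com</title>
  <meta name="description" content="A short, unique summary of this page's content.">
  <meta name="keywords" content="photography, landscapes, travel">
  <meta name="robots" content="index, follow">
</head>
```

The description in particular is worth writing by hand per page, since search engines often display it as the snippet under the result link.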
Speed
Search engines also rank how fast a website is. Google provides a helpful website here, which rates your site and gives hints on what to improve next. To comply with the various requirements, I adjusted my .htaccess and robots.txt files. I make full use of Joomla's caching and compression abilities. Additionally, I use RokBooster and the Gantry framework, which can combine and compress CSS, JS, and other file types. Images are cropped and reduced in file size.
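The server-side part of this can be sketched as an .htaccess fragment; this is an assumed, minimal example using Apache's mod_deflate and mod_expires modules, not a copy of my actual configuration:

```apache
# Compress text-based assets before sending them to the client.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache static assets so repeat visits skip the download.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
</IfModule>
```

The IfModule guards keep the site working even if a host disables one of the modules.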
I use OSmap to keep my sitemaps up to date. Find my general sitemap here and for images look here. There is also an HTML version and a news version that OSmap now properly supports. Installation and implementation were very easy.
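Search engines find the sitemaps faster when robots.txt points at them. A hypothetical robots.txt along those lines (the URLs are placeholders for the real sitemap addresses):

```text
User-agent: *
Disallow: /administrator/

Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-images.xml
```

The Sitemap directive is part of the sitemaps.org protocol and may appear multiple times, once per sitemap file.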
HTTPS and SSL usage
While proper secure connections between the server and the customer's device matter most for e-commerce websites, search engines have started to give ranking bonuses to websites that use SSL by default. The implementation was handled through my host and was very easy. Joomla also supports SSL well, and with the .htaccess file mentioned earlier, one can enforce the desired protocol.
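Enforcing HTTPS in .htaccess usually comes down to a single redirect rule; a common sketch (assuming Apache's mod_rewrite, not my literal configuration) looks like this:

```apache
# Permanently redirect any plain-HTTP request to its HTTPS equivalent.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

The 301 status tells search engines the move is permanent, so the HTTPS URLs inherit the ranking of the old HTTP ones.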
Google webmaster tools and analytics
Google has two more tools that can be used to monitor the website. They yield insights into how Google sees your web page, which helps especially with content that is blocked for robots. They also recommend steps and adjustments, for example to server settings. An interesting and rather new development is the content highlighter functionality in the webmaster tools. With a little help, this self-learning algorithm structures and reads data from your website much more efficiently.
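The same structural hints the highlighter learns by example can also be stated explicitly with schema.org markup. A hypothetical JSON-LD snippet for an article page (all values below are placeholders) could look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "A sample article title",
  "author": { "@type": "Person", "name": "Author Name" },
  "datePublished": "2016-01-01"
}
</script>
```

Explicit markup removes the guesswork for the crawler, whereas the highlighter is the low-effort route when editing templates is not an option.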