Drupal 7 dives into machine-readable web
Later better than never
The machine-readable web has come a step closer thanks to open sourcers in the Drupal community.
Drupal 7 has received sign-off - finally - adding the ability to embed semantic meta-data into sites using the open-source content management system.
That means Drupal 7 adds native support for the W3C's RDFa - a set of XHTML attributes designed to turn human-readable data into data that's readable by machines too. That could be data such as a location's map coordinates. RDFa is already being used by Google.
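As a minimal sketch of what that looks like, the fragment below marks up a location's coordinates with RDFa 1.0 attributes, using the W3C Basic Geo vocabulary (the `#office` identifier and the coordinate values are made up for illustration):

```html
<!-- RDFa 1.0 as used in XHTML: the xmlns declaration binds the "geo"
     prefix; about/typeof/property/content expose the coordinates to
     machines while the visible text stays human-friendly. -->
<div xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
     about="#office" typeof="geo:Point">
  <span property="geo:lat" content="51.507">51° 30' N</span>
  <span property="geo:long" content="-0.128">0° 7' W</span>
</div>
```

A search engine parsing the page can extract the machine-readable `content` values without the author having to publish the data twice.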
In a statement announcing Drupal 7, the Drupal Community said: "RDFa can add value by giving search engines more detail, details not visible to humans."
Drupal 7 had been expected as early as last summer, but it slipped as a relatively small number of active committers worked to squash the new system's few remaining bugs.
Drupal creator Dries Buytaert reckons Drupal runs one per cent of sites on the web, making it popular but not as popular as WordPress and Joomla, which are first and second respectively.
In an attempt to close this gap, Drupal 7 is bringing changes to attract a broader audience of less tech savvy users. According to Drupal, changes in the user interface are designed to make common tasks easier for 80 per cent of the software's users.
For the more technology faithful, Drupal 7 also brings a built-in test environment, version upgrade manager, and a database abstraction layer for use with MariaDB, SQL Server, MongoDB, Oracle, MySQL, PostgreSQL, and SQLite. ®
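To illustrate what that abstraction layer buys developers (this is a sketch, not code from the article, and it assumes it runs inside a bootstrapped Drupal 7 site), a query written against Drupal 7's database API is translated into the SQL dialect of whichever back-end is configured:

```php
<?php
// Fetch the ten most recent published articles. db_select() builds the
// query through Drupal 7's abstraction layer, so the same code works
// whether the site runs on MySQL, PostgreSQL, SQLite or another driver.
$result = db_select('node', 'n')
  ->fields('n', array('nid', 'title'))
  ->condition('n.type', 'article')
  ->condition('n.status', 1)
  ->orderBy('n.created', 'DESC')
  ->range(0, 10)
  ->execute();

foreach ($result as $row) {
  print $row->title . "\n";
}
```

The placeholder-based builder also handles escaping, which is the main reason to prefer it over hand-written SQL strings.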
Yes, as usual the Reg knows WTF they are talking about
The Reg is probably right.
Here are the Google Trends figures for WordPress vs Joomla vs Drupal.
The claim that Drupal powers more than 1% of the web comes from Dries Buytaert, leader of the Drupal project. He started a project to build a crawler to categorise sites, and Marc Seeger finished it. Marc wrote his thesis about the crawler. In Dries's April 2010 DrupalCon speech, he revealed that a crawl of the top million sites showed Drupal at 1%. His slides are online. It doesn't say so on the slides, but he also spoke about Joomla and WordPress stats. I was in the audience, and I remember WordPress being at about 8%, and Joomla being at about 3%. I'm fairly sure about WordPress; I'm less certain about Joomla. However, Drupal at that stage was definitely behind Joomla.
This also feels right - Drupal is typically used for more complex sites than Joomla. Naturally there are more simple sites than complex ones, so we can expect there to be more Joomla sites.
The Google Trends graph shows that Joomla is declining in search volume, but it's still well above Drupal. Of course search volume isn't necessarily a reflection of actual use of the different CMSs, but I can't think of any sensible reason why it wouldn't track it.
With regard to whether WordPress is a CMS: the majority of my business is built around Drupal and I'm heavily involved in the Drupal project. However, one of my clients uses WordPress very successfully to run their online industry-specific newspaper. It's not a heavyweight CMS, but it's definitely improving all the time, and the latest version supports structured data. Remember that it wasn't Habitat that bought Ikea; it's the job of us Drupal developers to make Drupal easier and nicer to use than WordPress. Drupal 7 is a good step in that direction. The user interface is better, but we still have a way to go to catch WordPress. On the other hand, the structured data handling in Drupal 7 is in another league compared to WordPress.
It also makes easier something a lot of web devs, owners and operators don't want: web scraping.
Of course it might make search engines better at categorising but, realistically, I agree with some post above. It's not for everyone.
Thanks for that Yautja_Cetanu - that's an interesting example.