Archive for the 'HTML / CSS' category

WordPress 2.5.1

April 28, 2008 No comments

As expected, about a month after the long work on the 2.5 branch, here is a maintenance release with the various bug fixes and security patches found since!

The upgrade procedure is the usual one, and as usual here is the link to download WordPress!


WordPress: get_links deprecated (also wp_get_links)

May 17, 2007 3 comments

I upgraded WordPress from 2.1.3 to 2.2 convinced that everything would go smoothly, and so it did ... or nearly!

The theme I use includes, at the bottom of the page, my blogroll split into two columns (one for each category).

The code that produced this has always been a call to get_links(), one for each category, roughly like this (where $category_id is the ID of the link category for that column):

<?php get_links($category_id, '<li>', '</li>', ' ', false, 'name', false, 1, -1, 1); ?>

But alas, using this old get_links command created a strange space between one item and the next, making the output painful to look at!

After cursing a bit in every language while looking for information on similar problems, I discovered that get_links, like wp_get_links, is considered deprecated, and that a new command should be used in its place: wp_list_bookmarks().

Armed with patience, I read the documentation on the WordPress site until I found that (in my case) the syntax to use is this:


<?php wp_list_bookmarks('show_images=1&show_description=0&orderby=name'); ?>
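For the two-column blogroll described above, a minimal sketch (assuming hypothetical link category IDs 2 and 3, and using the same query-string style of arguments) could call wp_list_bookmarks once per column:

<?php // left column: only the links in the (hypothetical) category 2
wp_list_bookmarks('categorize=0&category=2&title_li=&show_images=1&show_description=0&orderby=name'); ?>

<?php // right column: only the links in the (hypothetical) category 3
wp_list_bookmarks('categorize=0&category=3&title_li=&show_images=1&show_description=0&orderby=name'); ?>

Setting categorize=0 and an empty title_li keeps wp_list_bookmarks from printing its own category headings, so the theme's column markup can wrap each call.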

You can find further information about using wp_list_bookmarks in the official documentation.

Categories: HowTo, HTML / CSS, Wordpress

Sitemap for better placement in search engines!

May 15, 2007 5 comments

Maybe not all webmasters are aware that a sitemap (XML, ROR or TXT) is a powerful tool for increasing your site's presence in search engines. Thanks to the sitemap, search engine crawlers find the site's pages more easily and index them in the most correct way.

A web tool that makes creating a sitemap easier is definitely this one, which with a simple click lets you create your own sitemap.

The types of maps available are XML (for Google), Text (for Yahoo), ROR (for other search engines) and HTML (to be integrated into the site itself for easier navigation by users).
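Just to give an idea of the format, here is a minimal XML sitemap with a single entry (the URL, date and values are of course placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>      <!-- address of the page (placeholder) -->
    <lastmod>2007-05-15</lastmod>           <!-- date of last modification -->
    <changefreq>weekly</changefreq>         <!-- how often the page changes -->
    <priority>0.8</priority>                <!-- relative priority, from 0.0 to 1.0 -->
  </url>
</urlset>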

For WordPress, however, it is more convenient to rely on a very interesting plugin that regenerates the sitemap every time you edit or publish a new article on the blog! This plugin is called Google Sitemaps.

You can see the output here (after applying a style sheet): sitemap.xml!

The site is now visited by search engines more frequently and thus gains more popularity!

Categories: HowTo, HTML / CSS, Miscellaneous, Wordpress

Robots.txt, and the bots won't snoop around anymore!

May 15, 2007 4 comments

Have you just created a new website? Don't forget to put a robots.txt file in its root directory.

What is the robots.txt file?
(I quote verbatim from Wikipedia)

The robots.txt file contains rules used by crawlers (also called spiders) to restrict which pages of a website they analyze.
Web crawlers are automated programs that perform periodic searching and indexing. In the first phase of analyzing a website, they check for the existence of the robots.txt file and apply any restrictions requested by the site's webmaster.

The available fields are:

  1. User-Agent: the value of this field contains the name of the robot that must comply with the restrictions. With the character '*' the rule applies to every robot;
  2. Disallow: the value of this field contains the pages of the site to be excluded from indexing by robots. You can specify a single URL or a set of URLs belonging to a pattern.

HERE you can see an example of a robots.txt file.

An example could be the following:
User-agent: Googlebot-Image
Disallow: /

where we tell the bot "Googlebot-Image", which scours the web in search of images, not to visit anything under the root of our site ... and to move on!

Or again:
User-agent: * # Applies to all bots
Disallow: /private_directory/ # Blocks /private_directory/
Request-rate: 1/5 # Visit at most one page every 5 seconds
Visit-time: 0700-0845 # Visit only between 7:00 AM and 8:45 AM UT (GMT)

Once the robots.txt file is written, simply save it in the root directory of the site and the bots will learn not to pry into our site anymore!

And now all that remains is to check the syntax of our robots.txt "code" using the web validator.

Categories: HowTo, HTML / CSS, Various

CSSEdit: style sheets in a few clicks!

May 15, 2007 No comments

At Dade`'s suggestion, I too have tried this beautiful and handy piece of software called CSSEdit.

It stands out first of all for a clean and very intuitive interface, but above all for the way it gives anyone, even without a thorough knowledge of CSS, a way to modify a style sheet in detail.

You enter the URL of the site you want to work on and you immediately find yourself in front of its CSS ... then, with disarming simplicity, you can tweak everything that has been created and carry on building the style sheet with a few simple clicks! It is software that simplifies the webmaster's work because each change is shown in real time in the preview pane (especially useful when working on margins and on the positioning of the various "div" containers)!

Here's a screenshot of the program in action ...


Categories: Curiosity, HTML / CSS, Software