Archive for the 'Html / Css' category

WordPress 2.5.1

April 28, 2008 No comments

As expected, about a month after the long work on the 2.5 branch, a corrective release has arrived, with various bug fixes and security patches for the issues observed since!

The upgrade procedure is the usual one, and as usual here is the link to download WordPress!


WordPress: get_links deprecated (and wp_get_links too)

May 17, 2007 3 comments

I upgraded WordPress from 2.1.3 to 2.2 convinced that everything would go smoothly, and so it did ... or at least almost!

The theme I use displays my blogroll at the bottom of the page, split into two columns (one category each).

The code that has always made this possible is get_links(arguments). The call was mangled when this page was archived (the PHP tags, the category ID and the HTML markers were stripped), so what follows is a plausible reconstruction from the surviving fragments:

<?php get_links(1, '<li>', '</li>', ' ', FALSE, 'name', FALSE, 1, -1, 1); ?>

but alas, with this old get_links command a strange space was created between one item and the next, making the output painful to look at!

After cursing a bit in every language while searching for information about similar issues, I discovered that get_links and wp_get_links are both considered deprecated, and that a new command should be used in their place: wp_list_bookmarks(arguments).

Armed with a good deal of patience I read the documentation on the WordPress site, until I found that (in my case) this is the syntax to use:

<?php wp_list_bookmarks('category_name=0&title_li=&before=<li>&after=</li>&show_images=1&show_description=0&orderby=name'); ?>

(The HTML values of the before and after parameters were stripped when this page was archived; <li> and </li> are a plausible reconstruction.)

For more information about using wp_list_bookmarks, I refer you to the official documentation.

Categories: HowTo, Html / Css, Wordpress

A sitemap for better placement in search engines!

    May 15, 2007 5 comments

Perhaps not all webmasters know that a sitemap (XML, ROR or TXT) is a powerful tool for increasing your website's presence in search engines. Thanks to the sitemap, engine crawlers find the pages of the site more easily and index them in the most correct manner.

A web tool that facilitates the creation of a sitemap is certainly the one that, with a simple click, lets you generate your sitemap.

The available map types are XML (for Google), Text (for Yahoo), ROR (for other search engines) and HTML (to embed in the site to make navigation easier for users).
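To give a concrete idea of what the XML flavor contains, here is a minimal sketch in Python that builds a two-page sitemap; the URLs and dates are placeholders, not taken from this site:

```python
import xml.etree.ElementTree as ET

# An XML sitemap is a <urlset> in the sitemaps.org namespace,
# with one <url> entry (location + last-modified date) per page.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

pages = [
    ("http://example.com/", "2007-05-15"),
    ("http://example.com/archive", "2007-05-10"),
]
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

Saved as sitemap.xml in the site's root, a file in this format is exactly what the crawlers come looking for.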

For WordPress, however, the most convenient option is a very interesting plugin that automatically regenerates the sitemap whenever you edit or publish a new article on the blog! This plugin is called Google Sitemaps.

You can see the output here (after applying a style sheet): sitemap.xml!

The site will now be visited more frequently by search engines, and will thus gain more popularity!

Categories: HowTo, Html / Css, Miscellaneous, Wordpress

Robots.txt, and the bots will stop poking around!

    May 15, 2007 4 comments

Have you just created a new site? Don't forget to put a robots.txt file in its root directory.

    What is the robots.txt file?
    (I quote verbatim from Wikipedia)

The robots.txt file contains the rules that crawlers (also called spiders) use to apply analysis restrictions to the pages of a website.
Web crawlers are automated programs that perform periodic searches and indexing. In the first stage of analysing a website they check for the existence of the robots.txt file, in order to apply any restrictions requested by the site's webmaster.

There are two available fields:

1. User-Agent: the value of this field contains the name of the robot that must comply with the restrictions. With the character '*' the rule applies to every robot;
2. Disallow: the value of this field lists the pages of the site that must be excluded from the robot's indexing. You can specify a specific URL or a set of URLs matching a pattern.

HERE you can see an example of a robots.txt file.

For example, we can write:
    User-agent: Googlebot-Image
    Disallow: /

with which we tell the bot "Googlebot-Image", which scours the web in search of pictures, not to visit anything starting from the root of our site ... and to move on!
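You do not have to wait for a crawler to find out how such a rule is read: Python's standard urllib.robotparser can simulate a bot against those same two lines (example.com is of course a placeholder):

```python
from urllib.robotparser import RobotFileParser

# The same rules as the example above: lock Googlebot-Image out of everything.
rules = """\
User-agent: Googlebot-Image
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The image bot is blocked from the whole site; other bots are unaffected.
image_ok = rp.can_fetch("Googlebot-Image", "http://example.com/photo.jpg")
other_ok = rp.can_fetch("Googlebot", "http://example.com/photo.jpg")
print(image_ok, other_ok)
```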

Or again:
User-agent: * # Applies to every bot
Disallow: /private_directory/ # Blocks /private_directory/
Request-rate: 1/5 # Visit at most one page every 5 seconds
Visit-time: 0700-0845 # Visit only between 7:00 AM and 8:45 AM UT (GMT)

(Note that Request-rate and Visit-time are non-standard extensions, honored only by some crawlers.)

Once the robots.txt file is written, just save it in the root directory of the site and the bots will learn not to browse around our website any more!

And now it only remains to verify that we have written the "code" of our robots.txt correctly, using the web robots.txt validator.

Categories: HowTo, Html / Css, Miscellaneous

    CSSEdit: stylesheets in a few clicks!

    May 15, 2007 No comments

At the suggestion of Dade` I too tried this beautiful and handy piece of software called CSSEdit.

First of all, it features a clean and very intuitive interface, but above all it gives even those who do not have a thorough knowledge of CSS a way to modify every detail of a style sheet.

You enter the URL of the site you want to modify and immediately find yourself in front of its CSS ... you can then intervene, with disarming simplicity, on everything already created, and carry on the style sheet with simple clicks! It is software that greatly simplifies a webmaster's work, because every change is shown in real time in the preview pane (especially useful when working on margins and on the positioning of the various container "div"s)!

    Here's a screenshot of the program in action ...


Categories: Curiosity, Html / Css, Software