Archive

Archive for the 'HTML / CSS' category

WordPress 2.5.1

April 28, 2008 No comments

Released, as expected, about a month after the long work on the 2.5 branch: a maintenance release with various bug fixes and security patches found since!

The usual upgrade procedure applies, and as usual here is the WordPress download link!


WordPress: get_links deprecated (wp_get_links too)

May 17, 2007 3 comments

I upgraded from WordPress 2.1.3 to 2.2 convinced that everything would go smoothly, and so it did ... almost!

The theme I used includes, at the bottom of the page, my blogroll split into two columns (one per category).

The code that produced this has always been get_links(arguments), namely:

<?php get_links($category, '<li>', '</li>', ' ', false, 'name', false, 1, -1, 1); ?>

(the category argument and the list markup were stripped by the blog's HTML filter when the post was published; the first arguments shown here are placeholders)

but alas, with this old get_links command a strange space appeared between one item and the next, making the output painful to look at!

After swearing a bit in every language while hunting for information on similar issues, I found that get_links and wp_get_links are both considered deprecated, and that a new command should be used in their place: wp_list_bookmarks(arguments).

Armed with patience, I read the documentation on the WordPress site until I found that (in my case) this is the syntax to use:

<?php wp_list_bookmarks('category_name=0&title_li=&before=&after=&show_images=1&show_description=0&orderby=name'); ?>

(the before/after values, originally HTML markup, were stripped when the post was published)

For more information about wp_list_bookmarks, cross-reference the official documentation.
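As a side note, wp_list_bookmarks also accepts its arguments as an array instead of a query string; a sketch of an equivalent call (the values mirror the query string above, and are of course specific to my case):

<?php
// Equivalent call with array arguments instead of a query string
wp_list_bookmarks( array(
    'category_name'    => '0',
    'title_li'         => '',   // no list title
    'before'           => '',
    'after'            => '',
    'show_images'      => 1,    // show link images
    'show_description' => 0,    // hide link descriptions
    'orderby'          => 'name',
) );
?>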

Categories: HowTo, HTML / CSS, WordPress

Sitemap for better placement in search engines!

May 15, 2007 5 comments

Maybe not all webmasters know that a sitemap (XML, ROR or TXT) is a powerful tool for increasing your site's presence in search engines. Thanks to the sitemap, engine crawlers are more likely to find the site's pages and to index them correctly.

A web tool that makes sitemap creation easy is definitely XML-Sitemaps.com, which lets you build your sitemap with a single click.

The available map types are XML (for Google), Text (for Yahoo), ROR (for other search engines) and HTML (to be embedded in the site itself, easing navigation for users).
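To give an idea of the format, here is a minimal XML sitemap as defined by the sitemaps.org protocol (the URL and values are of course just placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-05-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Only the loc tag is required; lastmod, changefreq and priority are optional hints for the crawler.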

For WordPress, however, let me point you to a very interesting plugin that automatically regenerates the sitemap every time you publish or edit an article on the blog! The plugin is called Google Sitemaps.

You can see the output here (after applying a style sheet): sitemap.xml!

The site is now visited more frequently by search engines and should therefore gain popularity!

Categories: HowTo, HTML / CSS, Miscellaneous, WordPress

Robots.txt, and bots won't poke around anymore!

May 15, 2007 4 comments

Have you just created a new website? Don't forget to put a robots.txt in your root directory.

What is the robots.txt file?
(I quote verbatim from Wikipedia)

The robots.txt file contains settings used by crawlers (also called spiders) to apply restrictions to the analysis of a website's pages.
Web crawlers are automatic programs that periodically search and index pages. In the first phase of analysing a website they check for the existence of a robots.txt file, so as to apply any restrictions requested by the site's webmaster.


The available fields are:

1. User-Agent: the value of this field contains the name of the robot that must comply with the restrictions. With the character '*' the rule applies to every robot;
2. Disallow: the value of this field lists the pages of the site that robots must exclude while indexing. You can specify a specific URL or a set of URLs matching a pattern.


HERE you can see an example of a robots.txt file.

One example could be this:
User-agent: Googlebot-Image
Disallow: /

Here we tell the bot "Googlebot-Image", which scours the web looking for images, not to explore our site at all ... move along!

Or again:
User-agent: *                    # applies to every bot
Disallow: /private_directory/    # blocks /private_directory/
Request-rate: 1/5                # visit at most one page every 5 seconds
Visit-time: 0700-0845            # visit only between 7:00 AM and 8:45 AM UT (GMT)
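Worth knowing: Request-rate and Visit-time are non-standard extensions honored only by some crawlers. A related extension, announced by the major search engines in 2007 as part of the sitemaps.org protocol, lets robots.txt advertise the sitemap location as well (the URL here is just a placeholder):

User-agent: *
Disallow: /private_directory/

Sitemap: http://www.example.com/sitemap.xml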


Once the robots.txt file is written, just save it in the root directory of the site and bots will learn not to browse where they shouldn't on our website!

Now you just have to verify that the "code" of our robots.txt is written correctly, using the robots.txt validator on the web.

Categories: HowTo, HTML / CSS, Miscellaneous

CSSEdit: style sheets in a few clicks!

May 15, 2007 No comments

At the suggestion of Dade` I too tried this beautiful and handy piece of software called CSSEdit.

It stands out first of all for a clean and very intuitive graphical interface, but also for allowing even those without a thorough knowledge of CSS to modify every detail of a style sheet.

You enter the URL of the site you want to change and immediately find yourself in front of its CSS ... from there you can inspect, with disarming simplicity, everything that was created, and then edit the style sheet with simple clicks! This software greatly simplifies the webmaster's work, because every change is displayed in real time in the preview pane (especially useful when working on margins and on the positioning of the various 'div' containers).

Here's a screenshot of the program in action ...

[CSSEdit screenshot — click to enlarge]


Categories: Curiosity, HTML / CSS, Software