Archive

Archive for the 'HTML / CSS' category

WordPress 2.5.1

April 28, 2008 No comments

As expected, about a month after the long work on the 2.5 branch, a maintenance release has arrived with fixes for the various bugs and security issues found since then!

The upgrade procedure is the usual one and, as usual, here is the link to the WordPress download!


WordPress: get_links deprecated (even wp_get_links)

May 17, 2007 3 comments

I upgraded from WordPress 2.1.3 to 2.2 convinced that everything would go smoothly, and so it did... or almost!

The theme I use includes, at the bottom of the page, my blogroll split into two columns (one for each category).

The code that made this possible has always been a call to get_links() with a list of arguments, namely (roughly) this:

<?php get_links($category_id, '<li>', '</li>', '', false, 'name', false, 1, -1, 1); ?>

But alas, after the upgrade this old get_links command produced a strange gap between one item and the next, making the output painful to look at!

After cursing for a while in every language while hunting for information on similar issues, I found that get_links, like the even older wp_get_links, is considered deprecated, and that a new function should be used in its place: wp_list_bookmarks().

Armed with plenty of patience, I read the documentation on the WordPress site until I found that (in my case) the syntax to use is this:

<?php wp_list_bookmarks('category_name=0&title_li=&before=<li>&after=</li>&show_images=1&show_description=0&orderby=name'); ?>

For more information on the use of wp_list_bookmarks, refer to the official documentation.

Categories: HowTo, HTML / CSS, WordPress

Sitemap for better placement in search engines!

May 15, 2007 5 comments

Perhaps not all webmasters know that the sitemap (XML, TXT or ROR) is a powerful tool for increasing your website's presence in search engines. Thanks to the sitemap, search engine crawlers can index the site's pages more easily and more accurately.

A web tool that makes creating a sitemap easy is XML-Sitemaps.com, which lets you generate your sitemap with a single click.

The available map types are XML (for Google), Text (for Yahoo), ROR (for other search engines) and HTML (to embed in the site itself, making navigation easier for users).
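To give an idea of the XML format, here is a minimal sitemap sketch; the URL and dates below are made-up placeholders, not taken from any real site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; only <loc> is mandatory. -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-05-15</lastmod>      <!-- last modification date -->
    <changefreq>weekly</changefreq>    <!-- hint for crawler revisit rate -->
    <priority>0.8</priority>           <!-- relative importance, 0.0 to 1.0 -->
  </url>
</urlset>
```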

Even more convenient for WordPress users is a very interesting plugin that automatically regenerates the sitemap every time you edit or publish an article on the blog! This plugin is called Google Sitemaps.

You can see its output here (after applying a style sheet): sitemap.xml!

The site will now be visited more often by search engines and will therefore gain more popularity!

Categories: HowTo, HTML / CSS, Miscellaneous, WordPress

Robots.txt, and bots won't snoop around anymore!

May 15, 2007 4 comments

Have you just created a new site? Don't forget to put a robots.txt file in your root directory.

What is the robots.txt file?
(I quote verbatim from Wikipedia)

The robots.txt file contains the rules used by crawlers (also called spiders) to apply analysis restrictions to the pages of a website.
Web crawlers are automatic programs that periodically search and index the web. In the first stage of analysing a website, they check for the existence of the robots.txt file and apply any restrictions requested by the site's webmaster.


There are two available fields:

1. User-Agent: the value of this field contains the name of the robot that must comply with the restrictions. The character '*' applies the rule to every robot;
2. Disallow: the value of this field contains the pages of the site that the robot must exclude during indexing. It may indicate a specific URL or a series of URLs belonging to a pattern.


HERE you can see an example of a robots.txt file.

A first example can be written like this:
User-agent: Googlebot-Image
Disallow: /

in which we tell the "Googlebot-Image" bot, which scours the web looking for images, not to visit anything under the root of our site... and to move on!

Or again:
User-agent: * # Applies to every bot
Disallow: /private_directory/ # Blocks /private_directory/
Request-rate: 1/5 # Visit at most one page every 5 seconds
Visit-time: 0700-0845 # Visit only between 7:00 AM and 8:45 AM UT (GMT)

(Note that Request-rate and Visit-time are nonstandard extensions that only some crawlers honor.)


Once the robots.txt file is written, just save it in the site's root directory and the bots will learn not to poke around our site anymore!

Now all that remains is to verify that we have written our robots.txt "code" correctly, using the web robots.txt validator.
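As a quick local sanity check, the rules above can also be tested programmatically. A minimal sketch using Python's standard-library urllib.robotparser (the bot names and paths below are only illustrative):

```python
from urllib.robotparser import RobotFileParser

# Rules equivalent to the two examples above, combined into one file.
rules = """
User-agent: Googlebot-Image
Disallow: /

User-agent: *
Disallow: /private_directory/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot-Image is blocked from the whole site...
print(parser.can_fetch("Googlebot-Image", "/images/photo.jpg"))  # False
# ...while any other bot is only blocked from /private_directory/.
print(parser.can_fetch("SomeOtherBot", "/private_directory/secret.html"))  # False
print(parser.can_fetch("SomeOtherBot", "/index.html"))  # True
```

This checks only the standard User-agent/Disallow fields; the nonstandard Request-rate and Visit-time lines are ignored by the parser.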

Categories: HowTo, HTML / CSS, Miscellaneous

CSSEdit: style sheets in a few clicks!

May 15, 2007 No comments

At the suggestion of Dade` I too tried this beautiful and handy piece of software called CSSEdit.

It stands out first of all for its clean and very intuitive interface, but above all for giving even those without a thorough knowledge of CSS a way to modify every detail of a style sheet.

You enter the URL of the site you want to modify and you immediately find yourself in front of its CSS... you can then intervene, with disarming simplicity, on everything already in place and refine the style sheet with a few simple clicks! It is a piece of software that greatly simplifies a webmaster's work, because every change is shown in real time in the preview pane (especially useful when working on margins and on the positioning of the various "div" containers)!

Here is a screenshot of the program in action...



Categories: Curiosity, HTML / CSS, Software