This post is about working with PHP: by installing xampp-win32-1.7.4-VC6-installer.exe we can install Apache, PHP and MySQL with a single installer. XAMPP can be downloaded from the Apache Friends website.
Monday, April 11, 2011
Android Books - you can download Android ebooks
Below are some Android ebooks you can download:
Beginning Android 2 - "Begin the journey toward your own successful Android 2 applications" - by Mark L. Murphy
Pro Android - by Sayed Y. Hashimi and Satya Komatineni
Unlocking Android: A Developer's Guide - by Frank Ableson, Charlie Collins and Robi Sen, with a foreword by Dick Wall
The Busy Coder's Guide to Android Development - by Mark L. Murphy
Beginning Android - "Master Android from first principles and begin the journey toward your own successful Android applications!" - by Mark L. Murphy
Source http://android-codes-examples.blogspot.com/2011/04/android-books-you-can-download-android.html
Have a good day.
Tuesday, March 1, 2011
SEO Tips
Keywords
1.Keywords in <title> tag
This is one of the most important places to have a keyword, because what is written inside the <title> tag shows in search results as your page title. The title tag must be short (6 or 7 words at most) and the keyword must be near the beginning.
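For example (a sketch; the wording is only a placeholder), a page targeting the phrase "SEO services" might use a short title with the keyword near the beginning:
<title>SEO Services - Affordable Search Engine Optimization</title>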
2.Keywords in URL
Keywords in URLs help a lot – e.g. http://domainname.com/seo-services.html, where "SEO services" is the keyword phrase you are trying to rank well for. But if you don't have the keywords in other parts of the document, don't rely on having them in the URL.
3.Keyword density in document text
Another very important factor you need to check. A density of 3-7% for major keywords is best, 1-2% for minor ones. For instance, a keyword that appears 15 times on a 500-word page has a density of 3%. Keyword density of over 10% is suspicious and looks more like keyword stuffing than naturally written text.
4.Keywords in anchor text
Also very important, especially for the anchor text of inbound links, because if you have the keyword in the anchor text in a link from another site, this is regarded as getting a vote from this site not only about your site in general, but about the keyword in particular.
5.Keywords in headings (<H1>, <H2>, etc. tags)
One more place where keywords count a lot. But make sure your page actually has text about the particular keyword.
6.Keywords in the beginning of a document
Also counts, though not as much as anchor text, title tag or headings. However, keep in mind that the beginning of a document does not necessarily mean the first paragraph – for instance, if you use tables, the first paragraph of text might be in the second half of the table.
7.Keywords in <alt> tags
Spiders don’t read images but they do read their textual descriptions in the <alt> tag, so if you have images on your page, fill in the <alt> tag with some keywords about them.
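For example (the filename and description here are placeholders), an image with a meaningful, keyword-bearing alternative description looks like this:
<img src="dog-food.jpg" alt="Premium dry dog food in a 10 kg bag" />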
8.Keywords in metatags
Less and less important, especially for Google. Yahoo! and Bing still rely on them, so if you are optimizing for Yahoo! or Bing, fill these tags properly. In any case, filling these tags properly will not hurt, so do it.
9.Keyword proximity
Keyword proximity measures how close in the text the keywords are. It is best if they are immediately one after the other (e.g. "dog food"), with no other words between them. For instance, if you have "dog" in the first paragraph and "food" in the third paragraph, this also counts but not as much as having the phrase "dog food" without any other words in between. Keyword proximity is applicable for keyword phrases that consist of 2 or more words.
10.Keyword phrases
In addition to keywords, you can optimize for keyword phrases that consist of several words – e.g. "SEO services". It is best when the keyword phrases you optimize for are popular ones, so you can get a lot of exact matches of the search string, but sometimes it makes more sense to optimize for 2 or 3 separate keywords ("SEO" and "services") than for one phrase that might only occasionally get an exact match.
11.Secondary keywords
Optimizing for secondary keywords can be a gold mine, because while everybody else is optimizing for the most popular keywords, there will be less competition (and probably more hits) for pages that are optimized for the minor words. For instance, "real estate new jersey" might get a thousand times fewer searches than "real estate" alone, but if you are operating in New Jersey you will get fewer, though considerably better targeted, visitors.
12.Keyword stemming
For English this is not much of a factor, because words that stem from the same root (e.g. dog, dogs, doggy, etc.) are considered related, and if you have "dog" on your page you will get hits for "dogs" and "doggy" as well. For other languages keyword stemming could be an issue, because different words that stem from the same root may be considered unrelated, and you might need to optimize for all of them.
13.Synonyms
Optimizing for synonyms of the target keywords, in addition to the main keywords, is good for sites in English, for which search engines are smart enough to use synonyms when ranking sites. For many other languages, synonyms are not taken into account when calculating rankings and relevancy.
14.Keyword Mistypes
Spelling errors are very frequent, and if you know that your target keywords have popular misspellings or alternative spellings (e.g. Christmas and Xmas), you might be tempted to optimize for them. Yes, this might get you some more traffic, but having spelling mistakes on your site does not make a good impression, so you'd better not do it, or do it only in the metatags.
15.Keyword dilution
When you are optimizing for an excessive amount of keywords, especially unrelated ones, this will affect the performance of all your keywords and even the major ones will be lost (diluted) in the text.
16.Keyword stuffing
Any artificially inflated keyword density (10% and over) is keyword stuffing and you risk getting banned from search engines.
Links – internal, inbound, outbound
17.Anchor text of inbound links
As discussed in the Keywords section, this is one of the most important factors for good rankings. It is best if you have a keyword in the anchor text but even if you don’t, it is still OK.
18.Origin of inbound links
Besides the anchor text, it is important if the site that links to you is a reputable one or not. Generally sites with greater Google PR are considered reputable.
19.Links from similar sites
Having links from similar sites is very, very useful. It indicates that the competition is voting for you and you are popular within your topical community.
20.Links from .edu and .gov sites
These links are precious because .edu and .gov sites are more reputable than .com, .biz, .info, etc. domains. Additionally, such links are hard to obtain.
21.Number of backlinks
Generally the more, the better. But the reputation of the sites that link to you is more important than their number. Also important is their anchor text: whether there is a keyword in it, how old the links are, etc.
22.Anchor text of internal links
This also matters, though not as much as the anchor text of inbound links.
23.Around-the-anchor text
The text that is immediately before and after the anchor text also matters because it further indicates the relevance of the link – i.e. if the link is artificial or it naturally flows in the text.
24.Age of inbound links
The older, the better. Getting many new links in a short time suggests buying them.
25.Links from directories
Great, though it strongly depends on which directories. Being listed in DMOZ, Yahoo Directory and similar directories is a great boost for your ranking but having tons of links from PR0 directories is useless and it can even be regarded as link spamming, if you have hundreds or thousands of such links.
26.Number of outgoing links on the page that links to you
The fewer, the better for you because this way your link looks more important.
27.Named anchors
Named anchors (the target places of internal links) are useful for internal navigation, but they are also useful for SEO because you additionally stress that a particular page, paragraph or piece of text is important. In the code, a link to a named anchor looks like this: <a href="#dogs">Read about dogs</a>, where "dogs" is the named anchor (defined elsewhere on the page as <a name="dogs">).
28.IP address of inbound link
Google denies that they discriminate against links that come from the same IP address or C class of addresses, so for Google the IP address can be considered neutral to the weight of inbound links. However, Bing and Yahoo! may discard links from the same IPs or IP classes, so it is always better to get links from different IPs.
29.Inbound links from link farms and other suspicious sites
This does not affect you in any way, provided the links are not reciprocal. The idea is that it is beyond your control what a link farm links to, so you don't get penalized when such sites link to you, because this is not your fault. In any case, you'd better stay away from link farms and similar suspicious sites.
30.Many outgoing links
Google does not like pages that consist mainly of links, so you'd better keep them under 100 per page. Having many outgoing links does not get you any ranking benefits and could even make your situation worse.
31.Excessive linking, link spamming
It is bad for your rankings when you have many links to/from the same sites (even if it is not a cross-linking scheme or links to bad neighbors), because it suggests link buying or at least spamming. In the best case, only some of the links are taken into account for SEO rankings.
32.Outbound links to link farms and other suspicious sites
Unlike inbound links from link farms and other suspicious sites, outbound links to bad neighbors can drag you down. You need to periodically check the status of the sites you link to, because sometimes good sites become bad neighbors and vice versa.
33.Cross-linking
Cross-linking occurs when site A links to site B, site B links to site C and site C links back to site A. This is the simplest example but more complex schemes are possible. Cross-linking looks like disguised reciprocal link trading and is penalized.
34.Single pixel links
When you have a link that is a pixel or so wide, it is invisible to humans, so nobody will click on it, and it is obvious that such a link is an attempt to manipulate search engines.
Metatags
35.<Description> metatag
Metatags are becoming less and less important, but if there are metatags that still matter, these are the <description> and <keywords> ones. Use the <Description> metatag to write the description of your site. Besides the fact that metatags still rock on Bing and Yahoo!, the <Description> metatag has one more advantage – it sometimes shows up as the description of your site in search results.
36.<Keywords> metatag
The <Keywords> metatag also matters, though like all metatags it gets almost no attention from Google and some attention from Bing and Yahoo!. Keep the metatag to a reasonable length – 10 to 20 keywords at most. Don't stuff the <Keywords> tag with keywords that you don't have on the page; this is bad for your rankings.
37.<Language> metatag
If your site is language-specific, don't leave this tag empty. Search engines have more sophisticated ways of determining the language of a page than relying on the <language> metatag, but they still consider it.
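To illustrate tips 35-37, a typical <head> section with these metatags might look like this (all the content values are placeholders):
<meta name="description" content="Affordable SEO services for small businesses" />
<meta name="keywords" content="seo, seo services, search engine optimization" />
<meta http-equiv="content-language" content="en" />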
38.<Refresh> metatag
The <Refresh> metatag is one way to redirect visitors from your site to another. Only do it if you have recently migrated your site to a new domain and need to temporarily redirect visitors. When used for a long time, the <refresh> metatag is regarded as an unethical practice and can hurt your rankings. In any case, redirecting through a 301 is much better.
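For comparison, here is a meta refresh (the discouraged way) next to a 301 redirect done in an Apache .htaccess file with the standard Redirect directive; the URLs and paths are placeholders:
<meta http-equiv="refresh" content="0; url=http://www.newdomain.com/" />
# Better: a permanent (301) redirect in .htaccess
Redirect 301 /old-page.html http://www.newdomain.com/new-page.html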
Content
39.Unique content
Having more content (relevant content, which is different from the content on other sites both in wording and topics) is a real boost for your site’s rankings.
40.Frequency of content change
Frequent changes are favored. It is great when you constantly add new content but it is not so great when you only make small updates to existing content.
41.Keywords font size
When a keyword in the document text is in a larger font size than the other on-page text, this makes it more noticeable and therefore more important than the rest of the text. The same applies to headings (<h1>, <h2>, etc.), which generally are in a larger font size than the rest of the text.
42.Keywords formatting
Bold and italic are another way to emphasize important words and phrases. However, use bold, italic and larger font sizes within reason because otherwise you might achieve just the opposite effect.
43.Age of document
Recent documents (or at least regularly updated ones) are favored.
44.File size
Generally long pages are not favored, or at least you can achieve better rankings if you have 3 short rather than 1 long page on a given topic, so split long pages into multiple smaller ones.
45.Content separation
From a marketing point of view, content separation (based on IP, browser type, etc.) might be great, but for SEO it is bad, because when you have one URL and differing content, search engines get confused about what the actual content of the page is.
46.Poor coding and design
Search engines say that they do not want poorly designed and coded sites, though hardly any sites are banned because of messy code or ugly images. But when the design and/or coding of a site is poor, the site might not be indexable at all, so in this sense poor code and design can harm you a lot.
47.Illegal Content
Using other people’s copyrighted content without their permission or using content that promotes legal violations can get you kicked out of search engines.
48.Invisible text
This is a black hat SEO practice and when spiders discover that you have text specially for them but not for humans, don’t be surprised by the penalty.
49.Cloaking
Cloaking is another illegal technique, which partially involves content separation because spiders see one page (highly-optimized, of course), and everybody else is presented with another version of the same page.
50.Doorway pages
Creating pages that aim to trick spiders that your site is a highly-relevant one when it is not, is another way to get the kick from search engines.
51.Duplicate content
When you have the same content on several pages of the site, this will not make your site look larger, because the duplicate content penalty kicks in. To a lesser degree, duplicate content applies to pages that reside on other sites, but obviously these cases are not always banned – e.g. article directories and mirror sites do exist and prosper.
Visual Extras and SEO
52.JavaScript
If used wisely, it will not hurt. But if your main content is displayed through JavaScript, it becomes more difficult for spiders to follow, and if the JavaScript code is a mess and spiders can't follow it, this will definitely hurt your rankings.
53.Images in text
Having a text-only site is boring, but having many images and no text is an SEO sin. Always provide a meaningful description of an image in the <alt> tag, but don't stuff it with keywords or irrelevant information.
54.Podcasts and videos
Podcasts and videos are becoming more and more popular, but as with all non-textual goodies, search engines can't read them. So if you don't have a transcript of the podcast or video, it is as if the podcast or movie were not there, because it will not be indexed by search engines.
55.Images instead of text links
Using images instead of text links is bad, especially when you don’t fill in the <alt> tag. But even if you fill in the <alt> tag, it is not the same as having a bold, underlined, 16-pt. link, so use images for navigation only if this is really vital for the graphic layout of your site.
56.Frames
Frames are very, very bad for SEO. Avoid using them unless really necessary.
57.Flash
Spiders don’t index the content of Flash movies, so if you use Flash on your site, don’t forget to give it an alternative textual description.
58.A Flash home page
Fortunately this epidemic seems to have come to an end. Having a Flash home page (and sometimes whole sections of your site) with no HTML version is SEO suicide.
Domains, URLs, Web Mastery
59.Keyword-rich URLs and filenames
A very important factor, especially for Yahoo! and Bing.
60.Site Accessibility
Another fundamental issue that is often neglected. If the site (or separate pages) is inaccessible because of broken links, 404 errors, password-protected areas or other similar reasons, then the site simply can't be indexed.
61.Sitemap
It is great to have a complete and up-to-date sitemap; spiders love it, no matter whether it is a plain old HTML sitemap or the special Google sitemap format.
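A minimal sitemap in the Google (sitemaps.org) XML format looks like this; the URL and date are placeholders:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.domain.com/</loc>
    <lastmod>2011-03-01</lastmod>
  </url>
</urlset>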
62.Site size
Spiders love large sites, so generally the bigger, the better. However, big sites become user-unfriendly and difficult to navigate, so sometimes it makes sense to separate a big site into a couple of smaller ones. On the other hand, hardly any sites are penalized for having 10,000+ pages, so don't split your site into pieces only because it is getting larger and larger.
63.Site age
Similarly to wine, older sites are respected more. The idea is that an old, established site is more trustworthy (it has been around and is here to stay) than a new site that has just popped up and might soon disappear.
64.Site theme
It is not only keywords in URLs and on page that matter. The site theme is even more important for good ranking because when the site fits into one theme, this boosts the rankings of all its pages that are related to this theme.
65.File Location on Site
File location is important and files that are located in the root directory or near it tend to rank better than files that are buried 5 or more levels below.
66.Domains versus subdomains, separate domains
Having a separate domain is better – e.g. instead of having blablabla.blogspot.com, register a separate blablabla.com domain.
67.Top-level domains (TLDs)
Not all TLDs are equal. There are TLDs that are better than others. For instance, the most popular TLD – .com – is much better than .ws, .biz, or .info domains but (all equal) nothing beats an old .edu or .org domain.
68.Hyphens in URLs
Hyphens between the words in a URL increase readability and help with SEO rankings. This applies both to hyphens in domain names and in the rest of the URL.
69.URL length
Generally URL length doesn't matter, but very long URLs start to look spammy, so avoid having more than 10 words in the URL (3 or 4 for the domain name itself and 6 or 7 for the rest of the address is acceptable).
70.IP address
Could matter only for shared hosting or when a site is hosted with a free hosting provider, when the IP or the whole C-class of IP addresses is blacklisted due to spamming or other illegal practices.
71.Adsense will boost your ranking
Adsense is not related in any way to SEO ranking. Google will definitely not give you a ranking bonus because of hosting Adsense ads. Adsense might boost your income but this has nothing to do with your search rankings.
72.Adwords will boost your ranking
Similarly to Adsense, Adwords has nothing to do with your search rankings. Adwords will bring more traffic to your site, but this will not affect your rankings in any way whatsoever.
73.Hosting downtime
Hosting downtime is directly related to accessibility because if a site is frequently down, it can’t be indexed. But in practice this is a factor only if your hosting provider is really unreliable and has less than 97-98% uptime.
74.Dynamic URLs
Spiders prefer static URLs, though you will see many dynamic pages in top positions. Long dynamic URLs (over 100 characters) are really bad, and in any case you'd better use a tool to rewrite dynamic URLs into something more human- and SEO-friendly.
75.Session IDs
This is even worse than dynamic URLs. Don’t use session IDs for information that you’d like to be indexed by spiders.
76.Bans in robots.txt
If indexing of a considerable portion of the site is banned, this is likely to affect the non-banned part as well, because spiders will come less frequently to a "noindex" site.
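For example, a robots.txt that bans only one directory and leaves the rest of the site open to all spiders looks like this (the path is a placeholder):
User-agent: *
Disallow: /private/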
77.Redirects (301 and 302)
When not applied properly, redirects can hurt a lot – the target page might not open, or worse – a redirect can be regarded as a black hat technique, when the visitor is immediately taken to a different page.
source : http://www.webconfs.com/15-minute-seo.php
Module Rewrite – URL Rewriting Guide
Module Rewrite
Welcome to mod_rewrite, voodoo of URL manipulation.
This document describes how one can use Apache's mod_rewrite to solve typical URL-based problems webmasters are usually confronted with in practice. The Apache module mod_rewrite provides a powerful way to do URL manipulations. With it you can do nearly all types of URL manipulations you ever dreamed about. The price you have to pay is complexity, because mod_rewrite is not easy for a beginner to understand and use.
NOTE: Depending on your server configuration, it may be necessary to adapt the examples to your situation. Always try to understand what a rule really does before you use it; bad use can lead to infinite loops and hang the server.
Most examples can be used in the .htaccess file, while some work only in the Apache httpd.conf file.
RewriteCond
The RewriteCond directive defines a rule condition. Precede a RewriteRule with one or more RewriteCond directives; the rewriting rule is then only used if its pattern matches the current state of the URI and these additional conditions apply too.
You can set special flags for a condition pattern by appending a third argument to the RewriteCond directive. Flags is a comma-separated list of the following flags:
[NC] (No Case)
This makes the condition pattern case-insensitive: no difference between 'A-Z' and 'a-z'.
[OR] (OR next condition)
Used to combine rule conditions with an OR.
RewriteRule
The RewriteRule directive does the actual rewriting.
You can set special flags for the rule by appending a third argument to the RewriteRule directive. Flags is a comma-separated list of the following flags:
[R] (force Redirect)
Redirects the URL with an external redirection, sending HTTP response 302 (MOVED TEMPORARILY).
[F] (force URL to be Forbidden)
Forces the current URL to be forbidden, sending HTTP response 403 (FORBIDDEN).
[G] (force URL to be Gone)
Forces the current URL to be gone, sending HTTP response 410 (GONE).
[L] (last rule)
Forces the rewriting process to stop here and not apply any more rewriting rules.
[P] (force proxy)
This flag forces the current URL to be treated as a proxy request and passed through the proxy module mod_proxy.
Regular expressions
Some hints about the syntax of regular expressions:
Text:
. Any single character
[chars] One of chars
[^chars] None of chars
text1|text2 text1 or text2
Quantifiers:
? 0 or 1 of the preceding text
* 0 or more of the preceding text
+ 1 or more of the preceding text
Grouping:
(text) Grouping of text
Anchors:
^ Start of line anchor
$ End of line anchor
Escaping:
\char escapes that particular char
Condition pattern
There are some special variants of CondPatterns. Instead of real regular expression strings you can also use one of the following:
< Condition (is lexically lower)
Treats the Condition as a plain string and compares it to the test String. True if the String is lexically lower than the Condition.
> Condition (is lexically greater)
Treats the Condition as a plain string and compares it to the test String. True if the String is lexically greater than the Condition.
= Condition (is lexically equal)
Treats the Condition as a plain string and compares it to the test String. True if the String is equal to the Condition.
-d (is directory)
Treats the String as a pathname and tests if it exists and is a directory.
-f (is regular file)
Treats the String as a pathname and tests if it exists and is a regular file.
-s (is regular file with size)
Treats the String as a pathname and tests if it exists and is a regular file with size greater than zero.
-l (is symbolic link)
Treats the String as a pathname and tests if it exists and is a symbolic link.
-F (is existing file via sub request)
Checks if the String is a valid file and accessible via all the server's currently configured access controls for that path. Use it with care, because it decreases your server's performance!
-U (is existing URL via sub request)
Checks if the String is a valid URL and accessible via all the server's currently configured access controls for that path. Use it with care, because it decreases your server's performance!
NOTE: You can prefix the pattern string with a ‘!’ character (exclamation mark) to specify a non-matching pattern.
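As a quick illustration of the file tests and the '!' prefix together, this common pattern (a sketch, assuming your site routes everything through an index.php front controller) rewrites any request that does not match an existing file or directory:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ /index.php?q=$1 [L]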
Protecting your images and files from linking
DESCRIPTION: In some cases other webmasters link directly to your download files, or use images hosted on your server as inline images on their pages (hotlinking).
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$ [NC]
RewriteCond %{HTTP_REFERER} !^http://domain.com [NC]
RewriteCond %{HTTP_REFERER} !^http://www.domain.com [NC]
RewriteCond %{HTTP_REFERER} !^http://212.204.218.80 [NC]
RewriteRule ^.*$ http://www.domain.com/ [R,L]
EXPLAIN: In this case visitors are redirected to http://www.domain.com/ if the request did not arrive via a link from http://domain.com, http://www.domain.com or http://212.204.218.80.
Redirect visitor by domain name
DESCRIPTION: In some cases the same web site is accessible through different addresses, like domain.com, www.domain.com or www.domain2.com, and we want to redirect them all to one address.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www.domain.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R,L]
EXPLAIN: In this case the requested URL http://domain.com/foo.html would be redirected to the URL http://www.domain.com/foo.html.
Redirect a domain to another directory
# Serve www.domain.com from the /HTML2/ subdirectory (internal rewrite, no redirect)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.domain.com$
RewriteCond %{REQUEST_URI} !^/HTML2/
RewriteRule ^(.*)$ /HTML2/$1
Redirect visitor by user agent
DESCRIPTION: For important top-level pages it is sometimes necessary to provide pages depending on the browser. One has to provide a version for the latest Netscape, a version for the latest Internet Explorer, a version for Lynx or old browsers, and an average-feature version for all others.
# MS Internet Explorer – Mozilla v4
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^Mozilla/4(.*)MSIE
RewriteRule ^index\.html$ /index.IE.html [L]
# Netscape v6.+ – Mozilla v5
RewriteCond %{HTTP_USER_AGENT} ^Mozilla/5(.*)Gecko
RewriteRule ^index\.html$ /index.NS5.html [L]
# Lynx or Mozilla v1/2
RewriteCond %{HTTP_USER_AGENT} ^Lynx/ [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mozilla/[12]
RewriteRule ^index\.html$ /index.20.html [L]
# All other browsers
RewriteRule ^index\.html$ /index.32.html [L]
source : http://www.widexl.com/tutorials/mod_rewrite.html
Thursday, January 7, 2010
struts flow
Struts is an open-source implementation of the MVC design pattern for developing large-scale web applications. The Struts framework makes it easier to design reliable, scalable web applications in Java. Struts instantiates each Action only once and threads all requests through that single object, so Action classes must be written to be thread-safe. Struts reduces the need for redundant JSPs, and the ActionForm strategy reduces the need for a subclass hierarchy.
Struts is a lightweight package. It consists of 5 core packages and 5 tag library directories.
Whenever a client sends a request, it first goes to the CONTROLLER part (ActionServlet), which reads the request data and decides which action should be performed to process the client request. It then forwards to the appropriate Action class (MODEL part), where the functionality code (business logic) lives. From the MODEL part, control goes back to the CONTROLLER part, and from there the appropriate JSP pages (VIEW part) are picked to present (display) the result in the client's browser.
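As a minimal sketch of the MODEL part described above (the class name, forward name and logic are hypothetical), a Struts 1 Action puts its business logic in execute() and returns an ActionForward that the controller uses to pick the VIEW:
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

public class LoginAction extends Action {
    // Struts creates a single instance of this class, so keep it thread-safe:
    // use local variables, not instance fields, for per-request state.
    public ActionForward execute(ActionMapping mapping, ActionForm form,
            HttpServletRequest request, HttpServletResponse response)
            throws Exception {
        // business logic goes here
        return mapping.findForward("success"); // "success" maps to a JSP in struts-config.xml
    }
}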
Wednesday, December 23, 2009
Invoke a JSP error page from a servlet
You can invoke a JSP error page and pass the exception object to it from within a servlet. The trick is to create a request dispatcher for the JSP error page and pass the exception object as a javax.servlet.jsp.jspException request attribute. Note, however, that you can do this only from within controller servlets.
If your servlet has already obtained an OutputStream or PrintWriter, the JSP engine will throw the following error:
java.lang.IllegalStateException: Cannot forward as OutputStream or Writer has already been obtained
The following code snippet demonstrates the invocation of a JSP error page from within a controller servlet:
protected void sendErrorRedirect(HttpServletRequest request,
        HttpServletResponse response, String errorPageURL, Throwable e)
        throws ServletException, IOException {
    // Expose the exception to the error page under the well-known attribute name
    request.setAttribute("javax.servlet.jsp.jspException", e);
    // Forward to the JSP error page through a request dispatcher
    getServletConfig().getServletContext()
            .getRequestDispatcher(errorPageURL).forward(request, response);
}

public void doPost(HttpServletRequest request, HttpServletResponse response) {
    try {
        // do something
    } catch (Exception ex) {
        try {
            sendErrorRedirect(request, response, "/jsp/MyErrorPage.jsp", ex);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
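For completeness, a minimal version of the error page itself (MyErrorPage.jsp above is a hypothetical name) declares isErrorPage="true", which exposes the forwarded exception through the implicit exception object:
<%@ page isErrorPage="true" %>
<html>
<body>
<h2>An error occurred</h2>
<p>Message: <%= exception.getMessage() %></p>
</body>
</html>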
Wednesday, November 25, 2009
What is the purpose of tiles-def.xml file, resourcebundle.properties file, validation.xml file?
1. tiles-def.xml
tiles-def.xml is used as a configuration file for an application during Tiles development.
In it you can define the layout / header / footer / body content for your View.
Eg:
<tiles-definitions>
    <definition name="siteLayoutDef" path="/layout/thbiSiteLayout.jsp">
        <put name="title" value="Title of the page" />
        <put name="header" value="/include/thbiheader.jsp" />
        <put name="footer" value="/include/thbifooter.jsp" />
        <put name="content" type="string">
            Content goes here
        </put>
    </definition>
    <definition name="userlogin" extends="siteLayoutDef">
        <put name="content" value="/dir/login.jsp" />
    </definition>
</tiles-definitions>
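Note that for the definitions file to be picked up, the Tiles plug-in must be declared in struts-config.xml; a typical declaration (assuming the file lives under /WEB-INF/) looks like this:
<plug-in className="org.apache.struts.tiles.TilesPlugin">
    <set-property property="definitions-config" value="/WEB-INF/tiles-def.xml" />
</plug-in>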
2. validation.xml
The validation.xml file is used to declare sets of validations that should be applied to Form Beans.
Each Form Bean you want to validate has its own definition in this file.
Inside that definition, you specify the validations you want to apply to the Form Bean's fields.
Eg:
<form-validation>
    <formset>
        <form name="logonForm">
            <field property="username" depends="required">
                <arg0 key="prompt.username" />
            </field>
            <field property="password" depends="required">
                <arg0 key="prompt.password" />
            </field>
        </form>
    </formset>
</form-validation>
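Similarly, validation.xml only takes effect when the Validator plug-in is registered in struts-config.xml; a common declaration (assuming the conventional file locations) is:
<plug-in className="org.apache.struts.validator.ValidatorPlugIn">
    <set-property property="pathnames" value="/WEB-INF/validator-rules.xml,/WEB-INF/validation.xml" />
</plug-in>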
3. Resourcebundle.properties
Instead of having hard-coded error messages in the framework, Struts Validator allows you to specify a key to a message in the ApplicationResources.properties (or resourcebundle.properties) file that should be returned if a validation fails.
Eg:
In ApplicationResources.properties
errors.registrationForm.name={0} Is an invalid name.
In the registrationForm.jsp
<html:messages id="messages" property="name">
<font color="red">
<bean:write name="messages" />
</html:messages>
Output (in red color): abc Is an invalid name