5 Immediate SEO Settings To Apply For Quick Results

To be successful online, you need to work hard and, moreover, work 'smart'. If you are not smart online, someone else will take the cup. Blogging is basically a crowded mall with lots of shops representing blogs, and you have to attract the visitors, i.e. the customers moving around in the mall. In this post, I'll be covering very basic but very important SEO settings which you should apply immediately if you have a new blog or are about to restart your existing blog.


Some Points To Keep In Mind:

#1: Meta Tags

Let's begin with something everyone knows. Meta tags are snippets of markup that help search engines recognize various fields of your site, such as the title, description, and keywords (the last of which is largely ignored these days). There are also some OG (Open Graph) meta tags, used by Facebook to crawl your content accurately. The very basic meta tags you should apply to your blog immediately are:

<!-- SEO -->
<meta charset="UTF-8">
<meta name="description" content="Description">
<meta name="keywords" content="Keyword">
<!-- SEO -->

The last two define your description and keywords. They belong inside the <head> section, so place them in a file that loads on every page. For UTF-8, you can read about what it is here.
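The Open Graph tags mentioned above follow the same pattern. A minimal sketch (the content values and URLs are placeholders you replace with your own):

```html
<!-- Open Graph tags read by Facebook when your page is shared -->
<meta property="og:title" content="Your Post Title">
<meta property="og:description" content="A short description of the post">
<meta property="og:image" content="http://example.com/thumbnail.jpg">
<meta property="og:url" content="http://example.com/post-url/">
```

These go in the same <head> section as the tags above.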

#2: Sitemap Creation

Obviously, with a brand new site you cannot submit a blank sitemap to Google. The first task is to create your sitemap, which you will submit to Google later on once you have a decent number of posts. On WordPress, plugins like Google XML Sitemaps make it easy to create one, and some platforms like Blogger provide automatic sitemap generation.
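A sitemap is just an XML file listing your URLs. A minimal hand-written example following the sitemap protocol (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2014-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://example.com/first-post/</loc>
  </url>
</urlset>
```

Plugins generate this for you, but it helps to know what Google actually receives.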

#3: Google Verified Authorship

Time to attract visitors to your website. Make your search results stand apart from the others by verifying your authorship on Google+, which shows a small image of yourself beside your search results. This is a really easy task. All you have to do is add two link tags [ <link rel="author" href="profile"/> and <link rel="publisher" href="same profile"/> ] to your header file, or anywhere else, as long as that file loads on every post and page. The header file is recommended.
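Cleaned up, the two tags look like this (the profile IDs are placeholders for your own Google+ profile and page URLs):

```html
<link rel="author" href="https://plus.google.com/your-profile-id"/>
<link rel="publisher" href="https://plus.google.com/your-page-id"/>
```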

#4: Buying Adwords Credits

If you are looking for quick customers, AdWords is the best place for you. Unlike AdSense, AdWords lets you buy advertising space on Google search results and gain real human visitors fast. AdWords is cheap, effective, and really works. Some of my friends who now run full-time businesses started out with AdWords alone.

#5: SEO Friendly Framework

Make sure you don't mess up the code and internal framework of your website. Since we mainly write for readers and not for robots, we tend to forget how much site structure can affect SEO and rankings. A badly structured site can confuse Google's bots and drop your positions in search results. Always take care of this.

From Editor’s Desk

Five quick startup points that are sure to give your website a boosted start. If you have some more awaiting to munch, we're here to post 'em too. Keep 'em coming!

Guide to Finding the Best PHP Hosting Deal

This article will provide some tips that are both general and practical to help you locate PHP hosting deals on a budget.

Before diving into the tips, it is worth reflecting a bit on the state of the existing hosting industry.

Dedicated hosting services, also known as managed hosting services or dedicated servers, are a type of internet hosting where clients lease complete servers that are not shared with other individuals.

This is different from shared hosting, where individual servers are shared across multiple clients. Dedicated hosting is more flexible, since an individual client can have complete control over the server or servers he or she uses, including the ability to choose hardware and operating systems.

However, the tradeoff of using dedicated hosting services rather than shared ones is that dedicated servers are usually going to be more expensive.

PHP refers to a dynamic scripting language that more than 20 million websites all over the world use, from the biggest social networks, including Facebook, to the smallest blog sites.

There are a few common factors you will come to discover when searching for PHP hosting deals; these are found throughout the existing hosting industry. For example, you will generally spend more money the more bandwidth you are looking to use on a monthly basis through your server.

Similarly, the more server space you are asking for, the more money you will pay for the privilege. You will also find a mixture of Linux and Windows hosting systems, and ardent supporters of both. Security will be another area that different hosts advertise, as will the levels of support they offer and the amount of uptime they promise per month.

When searching for the most cost effective PHP hosting deals, a good place to start is with Linux based hosts. Because Linux operating systems tend to be free, web hosts tend to charge less for servers based on Linux (e.g., RedHat, Mandriva, Ubuntu, Slackware) than they do for servers based on Windows (e.g., 7, Vista, XP, NT, 2000). You are also likely to find more secure servers when using systems based on Linux or Unix operating systems due to the designs of the operating systems, but generally, regardless of which operating system you choose, measures for security will be in place.

Another way to potentially reduce costs is to sign a longer contract with a web host. The longer you promise to keep paying a company, the lower the monthly rate they are likely to charge you. However, be careful not to sign a contract for too many months, as you might want to switch to a different host later, and you won't want to be trapped for years to come.

A final tip is to sign up for only as much bandwidth as you anticipate needing by the end of your lease. Bandwidth means money, and the less you use, the less you’ll spend.

Shortcomings of Duplicate Content

Duplicate content is content that appears on the Internet word for word at more than one URL. The term also applies when the same site contains multiple pages with largely identical content.

The word 'duplication' is also sometimes used when search engines get confused between pages with the same description when displaying search results.

Popular search engines have always discouraged duplicate content, because they want to give internet users unique results, and so they eliminate identical content found on different websites from their result pages.

Types of Duplicate Content

The issue of duplicate content has been circulating among internet users for quite some time and has caused a lot of confusion among online peers, so it is high time to settle the dust.

There are many shortcomings to duplicate content, but here we will first dissect its types and then discuss the shortcomings of each.

One type is deceptive or malicious duplicate content; the other is non-deceptive or non-malicious duplicate content.

Shortcomings of Deceptive Duplicate Content

Content is categorized as deceptive duplicate content when it is intentionally duplicated across various websites to manipulate search engine rankings or to improve the page ranking of a particular site.

The shortcoming of this deceptive technique is that the page ranking of the site will ultimately suffer, or in severe cases the site might be removed entirely from the search engine's index.

Search engines like Google, Yahoo, and Bing absolutely discourage duplicated content. These search engines, mainly Google, are well aware of this technique and have implemented back-end crawling tactics to dig out sites involved in duplicating content.

Shortcomings of Non-Deceptive Duplicate Content

Search engines consider content non-deceptive duplicate content when the same content is placed more than once on the same site under various URLs.

For instance, if you have a clothing business and a red trouser is your newly added product, or the main item on your website, you will naturally place the same content about the red trouser on different pages. Google will instantly consider this duplicate content.

Even if you write a 300-word copyright policy at the end of every page, it will again be considered duplicate content, because Google will see it as keyword stuffing on every single page. There are shortcomings to this type of duplication too: when Google finds the same content on the same site, it chooses only one page of that site and discards the rest.

How to avoid Duplicate Content and its Shortcomings

As a matter of fact, there are many ways to avoid non-deceptive duplication of content and outrun its shortcomings. You can block some pages of your website with a 'noindex' meta tag in the head section, or disallow them in your robots.txt file, to stop search engine spiders from indexing those pages.

But the best way to make your website Google friendly is to use canonicalization. Canonicalization is used when there is more than one piece of identical content on your website and you want to stop Google from treating it as duplicate content: you pick a canonical (preferred) URL for Google to index.
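For example, to mark one URL as the preferred version and keep a near-duplicate page out of the index entirely, the tags look like this (the URLs are placeholders):

```html
<!-- On each variant of the page: point to the one canonical URL -->
<link rel="canonical" href="http://example.com/red-trouser/">

<!-- On a page you don't want indexed at all -->
<meta name="robots" content="noindex">
```

Both go in the <head> section of the relevant pages.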

Every now and then you should also check the content of your website for replication elsewhere. If you find that you are a victim and your content has been stolen and duplicated, you can report the matter to Google under their DMCA guidelines.

Besides this, there are also some disadvantages to Google's policy of removing duplicated content, which is hampering the page rank of many sites. But that is a subject for another article, to be discussed later.

Search Engine Optimization Tutorial

Free search engine optimization tutorial: material revolving around potential Search Engine optimization; discussed are spidering, spam, submissions, meta tags, and other relevant information.

Do not SPAM search engines

SPAMMING search engines – What spamming the search engines is and how to avoid doing it.

What is SPAM?

Spam is most closely associated with sending unwanted, unsolicited emails to unwitting recipients, but it is also considered SPAM to post unwanted ads to people on message boards, in chat rooms, and on newsgroups.

Search engine SPAM is when search engines consider you to be flooding their databases by submitting too many pages too often, or by submitting the same pages again and again. Other forms of SPAM include pages containing meta tags that have nothing to do with the actual page, and submitting the same exact page from different addresses.

What is search engine spidering?

What is Spidering?- Search engine spidering is briefly examined and considered.

A search engine robot is called a "spider," because when the robot goes to a page, it follows all the links on that page; in the imagination this action resembles a many-legged spider. The robot follows the links to see if the pages go where they say they go. To learn what these pages are, the spider reads their meta tags. Robots normally follow down at least one level when they index a page, and many spiders return later to do a "deep crawl" and index every page they find. For this reason, if you submit one page of a site to a search engine that crawls, eventually every single page will be indexed. If you ever want a page not to be indexed, you can add a line in the HEAD tags like this: <META NAME="robots" CONTENT="noindex,nofollow"> – you can also use the variations index,nofollow or noindex,follow, depending on what you want the spider to do. The default with no tag is that the robot will index and follow the page. A "robots.txt" file can also be used for a domain, a subject too broad to go into here; you don't need to know about it if you have a free site, since you can't use it anyway (it only works for domains).
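For domain owners, a robots.txt file in the site root does the same job as the meta tag, but for whole directories at once. A small sketch (the paths are placeholders; `*` means the rules apply to every robot):

```
User-agent: *
Disallow: /private/
Disallow: /cgi-bin/
```

Any well-behaved spider reads this file before crawling and skips the listed paths.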

How many pages should I submit to search engines and how often?

Tips and information related to submitting web pages to search engines.

Never ever submit the same page twice in a 24-hour period; in fact, it is best not to submit any single page more often than every 30 days, and ideally a page doesn't really need to be submitted more often than every 90 days. Search engines that "crawl" will eventually index every page of your site (so you don't need to submit every page). This process takes anywhere from 2-6 months. For this reason you should only submit the MOST IMPORTANT pages of your site. Don't submit more than two different pages in a 24-hour period; it is preferable to submit only one page at a time, allowing 2-3 days to pass between submissions. This approach offers you the benefit of spreading out your spidering. Remember that when you submit a page, all the pages it links to will also be indexed again. If you have 4 main pages, you can submit 1 a week and eventually all of your pages will be indexed once a week. Pages are also ranked higher when they are crawled rather than submitted, so you want to leave some "gem" pages for the spider to find. If you have more than one site, be sure they are linked together so that when one gets crawled they all do.

What are META TAGS?

Optimize META TAGS? – A simple definition of meta tags and how search engine robots use them.

Meta tags are tags that appear in the HTML at the top of each web page; they tell the spiders what's on the page in a fashion that is easy for them to index. The robot reads these tags and compares them to the content on the page. The better the TITLE, DESCRIPTION, and KEYWORDS tags match the actual content of the page, the higher the spider will rank the page. When the spider finds material in the tags that does not reflect the actual content of the page, it will give the page a low ranking or will not index the page at all.


More detailed insight into meta tags touching on their three major elements.

To see the meta tags of most web pages, a visitor can select "view" on the browser tool bar, then select "source," to look at all the HTML of that page (unless it is protected or hidden by a frame). The most important meta tags are the Title, Description, and Keyword tags. However, the <HEAD></HEAD> tags are important because they tell the spider the area where the tags exist. Also, there must be a <BODY> tag after the </HEAD> tag or the spider cannot read the tags at all.

The TITLE Tag – <TITLE>your title goes between these two tags</TITLE>

Optimize TITLE Tag – The title tag is examined and considered with suggestions for their proper use.

The Title tag should be between 4 and 70 characters long, or thereabouts, and preferably over 30. If the title isn't descriptive enough it may not receive a good ranking from the robots. Be sure to include keywords you want to rank well for in the title, because it is one of the most important ranking components of the page. "Mandy's Playhouse" may be a cool site name, but it's unlikely that anyone is searching the engines for those particular words. It's better to place in the title tag what is available at "Mandy's Playhouse," for instance: "Games, Music, Chat, and Fun" – do not repeat words here or the engines will penalize your ranking for the repeated words. Then, whatever your title says is what you want it to say at the top of your visible page, preferably in <H1></H1> tags. This way the spider can read the title in the meta tag, compare it to the page, see that it adds up, and rank the page better.


Optimize DESCRIPTION Tag – The "description" tag is examined and considered with suggestions for its proper use.

<META NAME="description" CONTENT="your description goes here">

This tag contains a short blurb describing the page contents. Once again it is vital to get in some keywords. Remember, at the search engine level you are first talking to a robot, and only later (after the page is indexed) to humans. You need to create descriptions that will appeal first to a spider ranking you, and second to a reader potentially clicking through. There is some dispute between various sources about what length a Description tag should be, but apparently between 150 and 160 characters works best. Shorter (110 characters) is probably better. Get your keywords in, but do not repeat them, and do not repeat the "Title" tag in your Description tag (it's a waste of quality words and will hurt your ranking). The description written in your Description tag ought to appear in some variation on your page, the closer to the top of the page, the better. Once again, the spider will compare the tag to the content and see that they add up.


Optimize KEYWORDS Tag – The "keywords" tag is examined and considered with suggestions for its proper use.

<META NAME="keywords" CONTENT="your keywords go here">

Robots are beginning to ignore this tag because so many people add words here that do not relate even remotely to their page. One form of SPAM is repeating keywords here again and again in hopes of ranking high for them; but the spiders are onto this, and pages with words overly repeated in this tag are penalized by being ranked low or being disqualified altogether and not indexed. Various sources have various angles that dispute one another on the subject of how many times a keyword can be repeated. Some say a keyword should be repeated 7 times, some say no more than 3 times, and some say no word should be repeated at all (meaning place the word in the tag once and once only). In light of this controversy I suggest not normally using a word more than once unless it is unavoidable. Keywords work the same as words in the Title and Description tags. 90% (better yet 100%) of the keywords listed in your tag should appear on your page, and not in a "list" and not overly repeated on the page; they should be used relevantly in real sentences that actually matter to the overall content of the page. Characters in the Keyword tag should not exceed 800 or so, and less is better. Many spiders only read the first 7 keywords anyhow and ignore the rest; for this reason place your most important keywords first. Do not use commas; they are not needed and they separate words that could be better used in "key phrases." Key phrases are two or more words used together that people might potentially use in a search. For instance someone might search for "new cars," instead of just "cars." If you use commas in your keyword tag then you'll have to repeat words to get phrases: "new" , "new cars" , and "cars." Without the commas you simply list "new" before "cars" and it becomes all three potential search inputs. Do not include a lower case and upper case version of keywords; most spiders are not case sensitive, and most searches are conducted in small letters, even when it's a proper name.
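Putting the three tags together, a head section following the advice above might look like this. It is an illustrative sketch using the "Mandy's Playhouse" example from earlier (all values are made up); note that the keywords are phrases with no commas, and no word is repeated:

```html
<HEAD>
<TITLE>Games, Music, Chat, and Fun</TITLE>
<META NAME="description" CONTENT="Free games, music downloads, and chat rooms at Mandy's Playhouse">
<META NAME="keywords" CONTENT="free games music downloads chat rooms">
</HEAD>
```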

Imbedded Keywords

Optimize Imbedded Keywords – Imbedded keywords are explained.

Imbedded keywords are keywords that appear in the addresses of your pages, including the URL itself. An example of an imbedded URL is mp3.com. This domain name provides the imbedded keyword “mp3,” which mp3.com then ranks higher for. This is why it is vital to properly name your site, your pages, and your page addresses. When a robot follows a link it reads the text of the link on the page it’s following, it reads the address of the page it’s going to, and reads that page. If everything adds up the robot ranks the page well. Imbedded keywords in the name of your site are not all that is important. The addresses of your pages should reflect the content of those pages. Let us consider a hypothetical web site; we’ll call it flowers.at57.com. Automatically the site has a better ranking for the keyword “flowers.” Now the links on the page might be “roses” , “tulips” , and “marigolds.” Here’s what the optimum addresses should look like: flowers.at57.com/roses.html, flowers.at57.com/tulips.html, and flowers.at57.com/marigolds.html. Carefully consider all your addresses before creating them, and do not abbreviate – abbreviations are meaningless to a robot.

<!-- Comment Tags -->

Optimize Comment Tags – A note about comment tags and how to use them.

Comment tags are tags that can be added to the HEAD section along with meta tags. Go to "view" on your browser tool bar and select "source" to read the comment tags I've added to this page. The proper format (and I've noticed that a lot of people blow this) is a left arrow, an exclamation point, two hyphens, a space, then the text; it closes after the text with two more hyphens and a right arrow. Inside the arrows you can write comments telling what your site's about; be sure to include keywords. How effective this is, is hard to say. From what I understand the only engine that definitely reads this tag is Hotbot; however, the presence of this text may help to reinforce the validity of your tags and page for other robots. Do not overdo it with comment tags. These tags can also be used in the BODY of your page to point out what exists there when a spider can't read it, like a java search box or script. Doing this also helps you to locate whatever follows the tag in the HTML (don't overdo it with these either).
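A correctly formed comment tag, following the format described above (the text inside is just an example):

```html
<!-- free games, music downloads, and chat for kids -->
```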

Hidden Tags

Optimize Hidden Tags – A note about hidden tags and warnings about invisible text.

<input type="hidden" value="put keywords here">

It is best to use as little "hidden" data as possible, and never ever use "invisible text" (text that doesn't show up because it is the same color as the background of the page – engines react harshly to this SPAM tactic). At the top of the page, directly following the BODY tag, you can place <input type="hidden" value="put keywords here"> – this may increase the validity of some of your keywords, but it messes up your page design by leaving a large gap at the top of the page where the hidden text appears; and it's questionable how effective it even is. I've included it here to help make you aware of it, but I don't recommend it, and some engines may even penalize you for it.

“ALT” Tags and Gif Names

Optimize “ALT” Tags and Gif Names – The benefits of using “alt” tags with gifs are considered, also contains notes on naming gifs.

ALT="describe image here"

Some spiders read "alt" tags. These are the tags that tell what a .gif or .jpg is – without the "alt" tag a robot crawling your page has no clue what the image is; so, you explain to the spider what the image is by creating this tag. It can be placed in various places in the HTML code; between "IMG" and "SRC" seems to work well for me. It's good to always add BORDER="0" if it's inside an HREF, or else an unsightly border will surround everything inside the anchor tag (<A></A>). This too helps increase the validity of your page. Also, when you name gifs and jpegs, name them what they are, for example: roses.gif, tulips.gif, marigolds.gif. This potentially increases the validity of your page because the textual words recur, and frankly, it makes managing your files much easier.
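An image link using both suggestions, the alt text and a descriptive file name (the file names and text are examples, with the alt attribute placed between IMG and SRC as described above):

```html
<A HREF="roses.html"><IMG ALT="red and white roses" SRC="roses.gif" BORDER="0"></A>
```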

What is Link Popularity?

What is link popularity and why is link popularity so important? This page gives you the overview of link popularity, its necessity and benefits.

Just how important is link popularity? Let me tell you a story. I had an mp3 search engine site that didn’t even rank in the top 200 at Google. I pointed some links at it, and lo and behold, I jumped up to number 90. I started some other sites and pointed more links at it and jumped to number 30. More links and I was number 15. I’ve added even more links lately and I’m now number 7; and it’s all because of link popularity. It doesn’t just work with Google, it works with all true crawlers (Lycos, AllTheWeb, etc.): Basically, the same thing happened at AltaVista; I wasn’t even in the top 200 at first, now after increasing my link popularity, I’m #5 there.

Link popularity relates to the number of links pointed to a web site. Over the years the number of links pointing to a site has apparently become more prominently important in search engine positioning, i.e. websites with lots of sites linking to them rank better in searches. Some writers on the subject even claim that search engines take into account the "relevance" of the links, i.e. sites linked to by similar sites rate better. Search engines are secretive about their exact practices and it's hard to tell what's true. One thing is certain: though link popularity is touted as a newly emergent element on the scene, it has actually been significant since the beginning because of the "crawl" factor. Regardless of whether or not most search engine software is sophisticated enough to actually meter link popularity as a crucial factor, and especially whether or not it is sophisticated enough to gauge themed links, the more links a site has pointing to it, the more often it is crawled by spiders, and the more often it is indexed. Therefore, it is crucial for a site to have at least a few external links pointing to it. This is more true now that some engines are even dropping sites that don't have at least one external link, and this could possibly become a trend with search engines to help conserve database space.

If you have more than one site, be sure they are pointed at each other to increase your link popularity. In fact you want to get a link everywhere you can: directories, classifieds, link exchanges, etc. Note that a link from a search engine like AltaVista or an index like Yahoo doesn't count as a link-back, because it is stored in a cgi-bin database, which is disallowed in the robots.txt. The best places to get link-backs are directories that publish pages in the public areas of their site. When you go to a directory, go down a level to look at the site listings and note the full URL address. Does the last part say "cgi-bin/," or does it say something like "home/internet/submissions.html"? The former is in a potentially disallowed cgi-bin. The latter is in the public docs area, and that's where you want your links. I've set up a link exchange directory with just this in mind.

20 Reasons Why I Choose WordPress

Please allow me to share with you my 20 reasons why I choose WordPress over the other readily-available blogging platforms.

Yes, I love my WordPress and will continue to use it. It has been – and will always be – my primary choice of blogging platform simply for its ease of use. If you are still on Blogspot, this article might just give you a bit of food for thought.

Is Your Blog Boring? Here’s How To Wake It Up!

“If you build it, it will come,” is an outdated and fanciful blogging adage. Influence doesn’t magically materialize through blogging alone. You need to honestly figure out why your blog isn’t as dynamic as it should be so you can make it work better for you.

Before you raise your fist to the sky and declare, “MY BLOG SUCKS!” take a step back and reevaluate what’s wrong with it, and how you can fix it. This is a more common occurrence than you think, and plenty of successful bloggers have encountered blog stasis at least once in their blogging careers.

Here are common reasons as to why nobody is interested in your blog, and how to wake it up.

  1. Poor design

If your blog isn't easy to navigate, or if your design and layout elements clash, it won't attract an audience. People like browsing through blogs that make them feel good, and they won't feel good if they think your blog is ugly.


  • Choose a clean template with easy-to-read fonts.
  • Use a color harmonious palette that doesn’t overpower the other elements of your layout.
  • Don't incorporate too many ads, especially animated ones.
  • Avoid videos set to autoplay.
  • Before relaunching your blog, test all links and buttons to make sure it's easy to navigate.
  2. Oversaturated niche

Some niches, such as food, fashion, and make-up reviews, are already so highly saturated that it would be hard for you to set yourself apart from the more established blogs. And readers are likely to gravitate toward the familiar and established.


  • Find a unique spin on your subject that you can claim as your own.
  • Or go back to square one, and find other niches that you would like to write about.
  3. Uninteresting or exaggerated titles

Clichéd titles such as My Ramblings On…, or hyperbolic, cheap landing-page titles like How Linda Lost 380lbs And Now You Can Too!!! won't capture your audience's attention. The first is too pedestrian, the second is overkill.


  • Make sure your title is clear and informative but subtle enough.
  • Use a catchy writing style — a rhyme, alliteration, or humor.
  • Read up on popular established blogs in your niche to see how their post titles are written.

If you're bored by your own posts, guess what: so are your readers.

  4. Off-the-mark content

Your posts may be long-winded and pointless, or they may lack information that your audience needs. Or they may be rehashed material from other blogs in the same niche. Nobody likes to read blog posts that offer little to no value.


  • Write original content.
  • Offer relevant AND unique information to surprise your readers.
  • Do your research well.
  • Make sure your topic is something you are well-informed about.
  5. Poorly delivered content

Your topics are interesting enough, but the way you speak to your audience turns them off. Are you too serious, condescending, patronizing, hostile, or phony?


  • Match your writing tone to your niche, but keep it friendly enough for readers.
  • If you’re writing about serious topics, keep the tone neutral and informative. Too much negative emotion will turn your readers off.
  • If you’re selling ideas, be careful not to oversell or patronize your readers.
  • Never talk down to your readers, regardless of their age range and life station.
  6. Self-indulgence

Sure, your blog is all about you and your brand. But if you keep trumpeting about your product’s great qualities and accomplishments, or if you share endless tales about yourself, your thoughts, activities, and opinions, it can get boring for your readers.


  • Make your stories relatable to your readers. Use shared interests as a focal point for discussion.
  • If you run a business blog, tell your readers how they can benefit from you and your products, instead of just how good you are.
  • Place a flexible limit on your post length.
  • Keep your posts concise.

Ask your readers questions relevant to your post.

  7. Lack of engagement

Your blog is a platform for you to be heard. But for your readers, it’s also a venue for them to be heard. Readers fall off the grid when they feel that the blogger they’re responding to isn’t capable of listening to them.


  • Respond to your readers by answering some of their comments.
  • Find comments that are potential conversation starters, and reply to them.
  • Encourage your readers to ask you questions, and answer them.
  • Add follow-up questions to your reader responses in your blog comments, or in your social media sites.
  8. Lack of promotion

Your blog won’t take off if you don’t sell it. It needs the help of your social media engagement to spread the word about its existence.


  • Market your blog in popular social media sites, as well as networking sites related to your niche. Be pro-active.
  • Once you’ve built a social media following, regularly and constantly engage them with conversations and updates.
  • Use email marketing to keep your readers updated with your blog posts.

Have you encountered these kinds of issues with blogging? What did you do to address them? Let me know in the comments below!

Making The Most Of Your Photoshop Scratch Disk

You can never have enough RAM for Photoshop, and when it runs out, Photoshop falls back on your hard drive to store data. This is much slower than RAM, but there are ways to make it work as efficiently as possible.

Most people's computers have only one hard drive installed, but Photoshop does not work efficiently with this set-up. This is because your operating system is also using this drive to store temporary information, Photoshop is pulling data from it, and a whole host of other processes and applications running in the background are also wrestling for access. Obviously, it doesn't take a techie geek to work out that all this going on behind the scenes will not do Photoshop any favours when it's trying to shuffle hundreds or even thousands of megs of data in and out of the drive!

The first step is to buy yourself a new hard drive – and buy the fastest you can afford. The factors to consider are rotation speed (7200rpm or faster), cache (8 meg or larger) and connection (SATA or ATA for internal, FireWire or USB2 for external – USB1 IS VERY SLOW). Size is not so much a price issue these days; I recently bought a 200gig 7200rpm 16meg cache internal drive for less than the price of a good night out with the lady! I have it partitioned into 3: the first partition for the scratch disk, the second for data to be backed up, and the third for my library of design elements.

Firstly, a word on partitions. It is a common misconception that if you have a single drive partitioned into two, the first for the system and the second for the scratch disk, you are getting the most out of your system – THIS IS WRONG! The drive can still only shift a certain amount of data through its connection, and this can actually reduce performance as the drive head has to whip around all over the disk!

It is, however, a good idea to keep a completely clean partition on a separate hard drive for the scratch disk. This makes it easier for Photoshop to find big, unfragmented areas of the drive to use as a scratch disk. For this reason it is a good idea to format this partition every few months or so to keep it in tip-top condition. Also, if you go for the kind of configuration I have outlined above, DON'T be tempted to use this to store files you are working on; just keep it as a dedicated scratch disk and Photoshop will be happy! Finally, use the first partition; I have been told this will use the outer sectors of the disk, which are more efficient than the inner sectors.

If you can afford it, a striped RAID array will give the ultimate Photoshop scratch disk performance. This is basically a process where a computer sees a number of hard drives as one big drive – as your data is read and written it is split across all the disks, giving you massive throughput of data, instead of the data all being forced down one cable to a single drive. This is a popular configuration for video editors for this exact reason. It can be expensive, but you will get much higher performance investing in 4 smaller, cheaper disks in a RAID configuration rather than 1 large disk. Mac OS X has software RAID configuration and control built in through Disk Utility, but Windows XP owners will have to invest in a third-party utility or hardware card to set up a RAID configuration.