11 Easy SEO Wins for Your Dermatology Practice Website

You have already spent countless hours studying to become a dermatologist; you shouldn’t have to spend the rest of your life studying to become an SEO expert. Here are 11 easy SEO wins that you can implement today to get your dermatology practice off the ground:

  1. Make sure that your business has consistent listings across all major citation sources.

First, take a look at the way your business is represented on your own website. Inspect the name of the business, the address, and the phone number. Double-check to make sure that this is the official business name, address, and phone number.

Second, check the three major citation sources: Google, Facebook, and Yelp. Most other citation sources aggregate information from these three sources, so if your business info is incorrect on any of these, dozens of other sites will publish the same incorrect information.

Lastly, check any other sites where you publish your business info, such as social media profiles on Twitter, Pinterest, Instagram, and so on.

Having all of your business information consistent across these sources will make it easier for potential patients to find you. It will also increase your chances of ranking well in Google’s local map pack, which can have a huge effect on lead generation. Ranking well in the local map pack leads to increased traffic from mobile, voice, and desktop search.
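This point is about consistency, not code, but if you also want to make the name, address, and phone number on your own site completely unambiguous to search engines, one optional approach (not something this article relies on, and offered here only as a suggestion) is schema.org markup. The sketch below uses the MedicalBusiness type, and every value is a placeholder.

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "MedicalBusiness",
      "name": "Example Dermatology Associates",
      "telephone": "+1-555-555-0100",
      "url": "https://www.yourdomain.com",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example Ave, Suite 200",
        "addressLocality": "Anytown",
        "addressRegion": "CA",
        "postalCode": "90000"
      }
    }
    </script>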

  2. Check to see if your practice website is indexed by Google

Being “indexed” simply means that your site shows up somewhere on the search engine results pages. It doesn’t matter if your site is on page 1 or page 100; being indexed is a milestone and means that potential patients can find you.

To see if your site is indexed, go to Google and perform this search: “site:yourwebsite.com”. If your practice website comes up, then congratulations. If not, there may be a few reasons for this:

  1. Your site is new, and Google’s crawlers have not made their way to your site yet. If this is the case, do not worry. It can take up to a few weeks for your site to get indexed by the major search engines, so just be patient and indexation will come.
  2. The robots.txt file on your site is blocking Google’s crawler bots from seeing your site. To check this, go to www.yourdomain.com/robots.txt. You should see a mostly blank page with only a few lines of text. One line should say “User-agent: *”, which means the rules that follow apply to all crawlers, including Google’s bots. The next line will most likely say something along the lines of “Disallow: /xxx”, which tells the bots not to crawl those specific pages. If the file says “Disallow: /” on its own, it is blocking crawlers from your entire site. Fix this and your problem should go away.
  3. Your website has a manual penalty against it. If you have been doing things that go against Google’s policies, you can find yourself in big trouble. Google can de-index you for violating its policies, so make sure you are following its best practices. Be sure to choose an ethical agency if you decide to pay for SEO services. Unfortunately, many agencies these days use “black hat” practices to get your site to rank faster, even though it might put your site at risk of de-indexation.

To check if your site has a manual penalty against it, go to Google Search Console, where you should see a message if one exists. Fix any issues with your site and submit it for reconsideration to get the penalty lifted.

  3. Create/update your robots.txt file

Since we just referenced this in our last point, we might as well tell you how to fully optimize your robots.txt file. Essentially, the robots.txt file on your website tells robots (bots) how to behave when they visit your site. There are many different types of bots, but the most important ones to pay attention to are Google’s.

The first step is to make sure that you even have a robots.txt file. Go to “www.yoursite.com/robots.txt” and the file should be found there. If there is not one, add it in. If there is, check to make sure it is not blocking the bots from crawling any pages that you want indexed by the search engines.

Second, make sure that the file references your sitemap. The file should include a line along the lines of “Sitemap: www.yourdomain.com/sitemap.xml”. This points the bots to a list of every page on your site, which gives those pages a greater chance of being indexed.
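Putting those pieces together, a minimal robots.txt for a practice site might look something like the sketch below. The disallowed path and domain are placeholders rather than recommendations; only block sections you genuinely do not want crawled.

    # Applies to all crawlers, including Google's bots
    User-agent: *

    # Block only private sections; a bare "Disallow: /" would block the entire site
    Disallow: /wp-admin/

    # Point the crawlers at your sitemap
    Sitemap: https://www.yourdomain.com/sitemap.xml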

There are many other directives you could add to this file, but most of them go beyond the scope of this article. For now, just make sure that the file is there, that it isn’t causing your site any harm, and that it references your sitemap, then move on.

  4. Update your sitemap and submit it to Google.

A sitemap is exactly what it sounds like – a map of your website. This is a file that lists the URL of every page of your website. This file is very useful for search engine crawlers, as it lists your whole site in one spot.

First, verify that you have a sitemap by going to “www.yoursite.com/sitemap” or “www.yoursite.com/sitemap.xml”. If this page is here and it has a list of your webpages, then you are in good shape. Most content management systems, like WordPress, will automatically create one for you. If not, create one and upload it to your site.
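If you do end up creating one by hand, a bare-bones sitemap.xml is just a list of URLs wrapped in the standard sitemap XML format. The sketch below is an illustration only; the URLs are placeholders.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want crawled -->
      <url>
        <loc>https://www.yourdomain.com/</loc>
      </url>
      <url>
        <loc>https://www.yourdomain.com/services/</loc>
      </url>
      <url>
        <loc>https://www.yourdomain.com/contact/</loc>
      </url>
    </urlset>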

Next, you need to submit the sitemap to Google. Go to your website’s Google Search Console dashboard, where you should find a place to submit your sitemap. Once you do, hit the test button to make sure the right version was uploaded and there are no problems. After this, it is only a matter of time until most of your webpages are indexed.

  5. Check to make sure your site has enough content

Content should be at the forefront of your SEO strategy. Google’s core strategy is to return results that satisfy the searcher’s intent. For example, if a person types “How to remove a mole at home?” into a search engine, they hope to find an article telling them how to do that. If they do not find what they need, they will quickly leave and click on another site. Leaving a site quickly is called “bouncing”, and sites with a high bounce rate do not rank as well as sites with a low bounce rate.

So how do we satisfy the searcher’s intent? First, you need to have the right content on your site. We will take a closer look at keyword targeting in a later point, but for now, we should focus on your ideal site visitor. Who is your ideal site visitor? For a dermatology practice, the ideal visitor is a potential patient who is looking for a practice and is on your site to find more information. Here are a few things they may want to know:

  • Hours of operation
  • Services/procedures
  • Insurance information
  • Pricing
  • Address
  • Phone number

You should always include this basic information on your site, in an easy-to-find spot, so that no potential patient ever leaves your site for a competitor because they could not find what they wanted.

One last thing to note about content is that having little content or text on a webpage can be a negative quality signal to Google. A good benchmark to aim for is 1,000 words for a homepage and 500 words on every other page of the site. These are not hard rules, but they are probably the minimum you would need for a useful site that satisfies a potential searcher’s intent.

  6. Check to make sure your site does not have duplicate content

While we are talking about content, we may as well talk about duplicate content. Duplicate content, or having the same information on different webpages, is a major negative quality signal to the search engines and could result in decreased rankings in the search engine results pages. You don’t want this, so it is important to make sure that your site has unique, original content.

There are two different types of duplicate content: internal and external. Both hurt your rankings. Internal duplicate content occurs when two or more webpages on your site have the same content. External duplicate content is having content on your site that also appears somewhere else on the web.

There could be many reasons for having internal or external duplicate content. A common example of internal duplicate content is having your practice’s name, number, address, and other info in the header or footer of your site, which repeats across every page. An example of external duplicate content is a description of a procedure you offer that was copied word for word from an authoritative source.

At this point, you know that duplicate content is not optimal. So, how do we find it? For internal duplicate content, use Siteliner (siteliner.com). All you have to do is type in your domain name and it will run a test showing what percentage of your site’s content is similar to other content on your site. To check for external duplicate content, use Copyscape (copyscape.com). This site does the same thing as Siteliner, except it compares your site to other sites on the web.

  7. Run a mobile-friendly test

You may be wondering, “Why should I worry about being mobile friendly?” Well, for starters, over half of all search traffic comes from mobile, and the share of searches coming from mobile is increasing every year, which means you need to start focusing on how to appeal to mobile searchers. In fact, you should take a “mobile-first” approach and worry less about desktop traffic.

Mobile search is any traffic coming from non-desktop sources, including smartphones, tablets such as the iPad, and voice search. To be mobile friendly, you need to make sure that your website supports mobile devices, loads fast on them, and is responsive. The easiest way to check is to use Google’s own mobile-friendly test: simply go to “search.google.com/test/mobile-friendly” and enter the URL you want to test.

If you find out that your site is not mobile friendly, fixing it is usually pretty easy. It is pretty rare for a site built on a modern content management system not to be mobile friendly, but if yours isn’t, you can speed up your site (which we will talk about in the next point), install a plugin that provides a mobile-friendly layout, or update your code to be responsive, as sketched below.
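If you do end up touching the code yourself, “responsive” mostly comes down to two things: a viewport meta tag in the page head and CSS media queries that adapt the layout to the screen width. The snippet below is a minimal, generic sketch; the class names are made up for illustration and are not from any particular theme.

    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      /* On screens narrower than 768px, stack the columns instead of floating them */
      @media (max-width: 768px) {
        .main-content,
        .sidebar {
          width: 100%;
          float: none;
        }
      }
    </style>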

  8. Speed up your site

Website speed is a huge factor in SEO. In fact, every additional second your site takes to load is estimated to cost you around 7% of your potential conversions. That adds up to a huge cost when you consider that thousands of potential patients could be visiting your site every month. Fortunately, speeding up a site is one of the easier SEO tasks to complete.

The first step to increasing site speed is to check how long your site takes to load. There are many different tools for this; the top three in our opinion are Google PageSpeed Insights (developers.google.com/speed/pagespeed/), Pingdom (pingdom.com), and our personal favorite, GTmetrix (gtmetrix.com). To use these tools, simply go to their site and type in the URL you want to test. They will give you the time it takes to load your site, along with a few useful tips on how to improve it.

If you follow the advice of the tools above and your site is still slow, there are a few more things you can do. First, remove any content that does not need to be there. Every piece of content on your site is data, whether text, images, or video, and takes time to load. Next, try optimizing images. The smaller the image file, the less data the server has to send to the visitor, so this can make a huge difference. There are many tools for optimizing images, but a good one is Optimizilla.

Third, minify your site’s code. There are often unnecessary spaces and comments in website code, and getting rid of them decreases the amount of data being sent. If your site uses WordPress, there are many plugins that will minify the code for you; if not, you can use a site such as willpeavy.com, paste in your code, and it will minify it for you. Another way to speed up your site is to use caching. This is a little technical, but ultimately it stores a ready-made copy of each page and serves that saved version to new visitors instead of rebuilding the page every time. The easiest way to set this up is a WordPress plugin like WP Fastest Cache. The last thing you can do is use a content delivery network (CDN) like Cloudflare. The concept behind a CDN is a bit technical, but if you go to their site (cloudflare.com), they will walk you through the process.
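To make the minification step concrete, here is the same small CSS rule before and after. The two versions behave identically; the minified one simply ships fewer bytes. The selector and values are made up for illustration.

    /* Before: readable, but full of spaces, line breaks, and comments */
    .appointment-button {
        background-color: #0077cc;  /* practice brand color */
        padding: 12px 24px;
        border-radius: 4px;
    }

    /* After minification: same rule, no extra whitespace or comments */
    .appointment-button{background-color:#0077cc;padding:12px 24px;border-radius:4px}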

  9. Free tools to check site health

Now that your site has all the necessary components of a great website, you can check the health of your site to see if there are any improvements you can make. There are many tools out there that will inform you of technical errors on your site, but we recommend a few specific ones:

Use these tools to identify site errors, correct them, and watch your search rankings and traffic skyrocket!

  10. Target the right keywords

Keyword research is an integral part of search engine optimization. There is no point in ranking for words that get no search traffic, or for terms that will not add value to your business. Before we get too far into keyword research, there are two terms you should know: short-tail and long-tail keywords. Short-tail keywords are things like “Dermatologist New York” or “Best Dermatologist”. These keywords are often incredibly difficult to rank for and often don’t add very much value. Long-tail keywords are things like “Dermatologist that specializes in acne scar removal” or “Orange County dermatologist open on Saturday”. These keywords are not only easier to rank for, but often convert better because there is an intent to purchase or take action behind them.

Now that you know you should go after long-tail keywords, how do you know which ones to target? Ideally, you should go after keywords that have high search volume but low competition. A lot of different sites claim to have search volume data, but the most reliable source is Google itself. Google has a tool called Keyword Planner that lets you find out how much search volume different keywords get, along with lots of other useful information. Use Google’s tool, find the right keywords, include them in your content, and watch the traffic roll in.

  11. Easy Link Building

When Google was founded, the core of the search engine was built on the idea that the best way to tell whether a site was high quality was the number of links pointing to it from other sites, a measure called “PageRank”. Obviously, Google has come a long way since then, with hundreds, if not thousands, of ranking factors in its algorithm. Still, links remain among the strongest ranking factors there are. A whole post could be written on backlink strategy, but here are a few tips you can implement today to get some valuable backlinks to your site.

For a lot of businesses, a great way to attract links is to create valuable content that others will want to link to. That involves hours and hours of researching, writing, and marketing the content, and for a busy dermatologist like you, writing articles is not the best use of your time. Instead, you can focus on links that take little effort to earn.

Before we go into how to acquire these links and where to get them, we should mention that not all links are created equal. There are two different types of links: dofollow and nofollow. Nofollow links are ones that do not pass “PageRank” or “link juice”; they are often, but not always, found in places like blog comments and citation listings. Dofollow links, on the other hand, do just the opposite: they tell the search engines that the linked site has the linking site’s vote of confidence.
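In the page’s HTML, the difference is just an attribute: a link is dofollow by default, and a nofollow link carries rel="nofollow". The URL and anchor text below are placeholders.

    <!-- Dofollow (the default): passes link equity -->
    <a href="https://www.yourdomain.com">Anytown dermatology practice</a>

    <!-- Nofollow: asks search engines not to pass link equity -->
    <a href="https://www.yourdomain.com" rel="nofollow">Anytown dermatology practice</a>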

Another thing to look at is the quality of the site and page a link comes from. Although Google officially stated that it will no longer update the public PageRank score, that does not mean it has stopped analyzing the quantity and quality of links. There are two good ways to gauge link quality. The first is Majestic (majestic.com), whose tool gives you two metrics: Citation Flow and Trust Flow. The second, and in our view the better option, is Moz’s tool, which tries to replicate PageRank with two metrics: Domain Authority and Page Authority. You can download a Moz plugin for Google Chrome that will give you this information.

Now that you know how to analyze links, let’s look at where you can find them. The first place is social media. Sites like Twitter, Instagram, and Pinterest all give you the option to link to your website from your profile. These links are generally nofollow, but they still have value if they drive traffic. Another source of links is citation sites. There are thousands of sites that list business information such as address, phone number, and website; a few examples are YellowPages, EZlocal, and Superpages. Some of these links will be dofollow, some will be nofollow. The last, and best, place to look for links is professional associations. Organizations like the American Academy of Dermatology often link to the websites of their members. Contact any organizations you belong to and ask if they will link to your site. These links are the most powerful because they come from relevant, trusted sites with plenty of backlinks of their own.

Conclusion

We hope you have picked up plenty of useful information for improving the search engine optimization of your site. If you have any questions about anything mentioned in this article, feel free to ask the experts at DermaLead. As always, thanks for reading and good luck in your search for patients!
