In 2019, it’s more important than ever to make sure your website’s SEO is in top shape. Whether you’re a small-time freelancer or a large marketing firm, following these ten tips will lead to SEO success for your college or university.
Keeping up with the latest marketing trends matters, but search engines can change their algorithms overnight, and it’s tough to keep up if you’re not paying attention. If you fail to stay on top of SEO and search changes, you may fall to the second page of search results, out of reach of the clicks that drive ROI. To stay relevant and competitive, use these ten SEO tips.
Search like a user
Think like a user and search for your programs, admissions department, or even an event. Do the right results come up? This is one of the best ways to determine whether you need an SEO strategy. From there, you can identify keywords and phrases to actively monitor and develop strategies for. Remember, since many users search using voice commands, include phrases and questions in your tests.
Use location indicators in title tags
Your HTML title serves two purposes. First, it can help indicate your physical location, such as Denver, Colorado. If a prospective student searches for a college program without indicating a location, Google will try to serve local results first; a location in the title can also help rankings when users search specifically for programs in Denver. Second, it’s the title shown in search results. Ranking #1 is not the only factor; your title needs to confirm to the user that clicking through to the site will be valuable. If you have multiple cities or locations, develop an SEO strategy to manage them.
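As a quick sanity check, you can script both jobs the title does at once: carry the location and stay short enough to display in full. Here is a minimal Python sketch; the program and school names are made up for illustration, and the ~60-character limit is a common rule of thumb for what Google displays, not an official cutoff.

```python
def build_title(program, school, city_state, max_len=60):
    """Compose a title tag that leads with the program and ends with
    the location; warn if it is likely to be truncated in the SERP.
    The ~60-character limit is a rule of thumb, not a Google spec."""
    title = f"{program} | {school} | {city_state}"
    if len(title) > max_len:
        print(f"Warning: {len(title)} chars, may be truncated: {title}")
    return title

# Illustrative names only:
print(build_title("Nursing Degree", "Front Range College", "Denver, CO"))
```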
Make sure your DNS is clean
Without even knowing it, you could be telling Google you have multiple sites (www.yoururl.com and yoururl.com, each with and without HTTPS). Remember, you should have one main site, with the rest redirecting to it. You can test this by typing in the four combinations previously listed. Do three of them redirect to the one master? Now, do you have any other URLs the site is known by? Does each of those also redirect?
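The four combinations are easy to enumerate programmatically. A minimal Python sketch (the domain is a placeholder; actually resolving the redirects requires network access, so only the enumeration is shown here):

```python
def url_variants(domain):
    """The four scheme/host combinations that should all resolve to
    one master site: http/https crossed with and without www."""
    return [f"{scheme}://{host}/"
            for scheme in ("http", "https")
            for host in (domain, f"www.{domain}")]

# In practice you would fetch each variant (e.g. with
# urllib.request.urlopen) and confirm the final URL after redirects
# is the same single canonical address for all four.
for variant in url_variants("yoururl.com"):
    print(variant)
```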
Switch to HTTPS
Switching your site from HTTP to HTTPS can boost your search engine rankings. In 2014, Google announced that HTTPS is a ranking factor in its search algorithm. Perform any Google search and you’ll notice that nearly all of the page-one results begin with an HTTPS URL. Google has also indicated that an HTTPS site can serve as a tie-breaker between two sites offering similar information.
Quickly test your website’s security status by typing https://www.yoururl.com into your browser and checking whether “Secure” or “Not Secure” appears in the address bar.
Check your site for speed
To give your website a faster load speed, optimize all photos. If you use a content management system, I strongly recommend minimizing the number of plugins. And when possible, build styles directly into the theme’s CSS rather than loading extra assets to carry out your brand.
Test your website speed by using one of the following free resources:
Claim (and optimize!) your locations in Google MyBusiness
Google MyBusiness (GMB) is a free listing from Google that businesses can use to manage their company’s information across Google Search and Google Maps. Keeping your business information accurate and up to date, specifically your name, address, and phone number (NAP), is crucial if you want potential customers to find your company, since it provides Google with the correct information to display.
Through an algorithm, Google uses hundreds of factors to determine what sites show up in local search results. Here are a few of the primary ones:
- Distance: How close the business is to the searcher or the area where the searcher is located
- Prominence: How well-known or popular the business/website is based on information Google pulls from the internet
Make sure you’re monitoring your reviews and all user questions and answers. Keep information current and feature enrollment links and texting tools so users can contact you.
Check your robots.txt file
Robots.txt is often overlooked, but it’s vital to getting your website indexed and receiving strong traffic. Within your robots.txt file, you can disallow crawling of anything you do not want Google to find and index.
Tip: A robots.txt file can be accidentally copied over from an internal staging server, unintentionally telling Google not to index your site. Check it frequently, and after every major deployment or upgrade. Here’s the basic format:
- User-agent: [user-agent name]
- Disallow: [URL string not to be crawled]
Learn more about creating your robots.txt from Google.
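You can also verify what your rules actually block with Python’s standard-library robots.txt parser. A quick sketch with a made-up rule set (the paths are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Sample rules: a staging-style "Disallow: /" would deindex the whole
# site; these rules only block the (hypothetical) /admin/ section.
rules = """
User-agent: *
Disallow: /admin/
""".strip().splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Googlebot", "https://www.yoururl.com/programs/"))     # True
print(rp.can_fetch("Googlebot", "https://www.yoururl.com/admin/login"))   # False
```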
Take advantage of Google Webmaster Tools
Google Webmaster Tools (since rebranded as Google Search Console) is free online software that helps you track your web page index results, keyword analytics, sitemaps, and more.
Establish a healthy XML sitemap
The word “sitemap” can be confusing because it’s used to describe several different, yet important, elements of a website. Your site should have a hidden XML sitemap page. This page, not visible in site search or the navigation, is designed as a tool to help search engines. Once you submit your sitemap to Google Webmaster Tools, Google will crawl your website using its many bots and create a text outline of the site (also known as a sitemap). You can also use free online XML sitemap generators such as XML Sitemaps. Remember, you don’t want to submit utility pages, and you can also create sub-category sitemap pages.
Make metadata a priority
Meta descriptions provide concise summaries of webpages that readers use to decide which search result to click. The closer a result matches the user’s intent, the higher the chance of a click and further engagement. These short descriptions appear underneath the blue clickable links on a search engine results page (SERP). However, depending on a user’s query, Google may pull meta description text from other areas of your page in an attempt to better answer it.
Here are two of the most important metadata guidelines:
- Put the most important information first.
- Do not use the same descriptions for multiple pages.
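Both guidelines are easy to audit in bulk. A minimal Python sketch (the URLs and descriptions are made up; the ~160-character limit is a common rule of thumb for what Google displays, not an official number):

```python
def audit_meta_descriptions(pages, max_len=160):
    """Flag missing, overlong, or duplicated meta descriptions.
    `pages` maps URL -> description text; ~160 chars is a rule of
    thumb for the SERP display limit, not a Google spec."""
    problems = []
    first_seen = {}
    for url, desc in pages.items():
        if not desc:
            problems.append((url, "missing description"))
            continue
        if len(desc) > max_len:
            problems.append((url, "likely truncated in search results"))
        if desc in first_seen:
            problems.append((url, f"duplicates {first_seen[desc]}"))
        else:
            first_seen[desc] = url
    return problems

# Illustrative pages only:
site = {
    "/admissions/": "Apply to our Denver campus. Deadlines, requirements, and aid.",
    "/programs/": "Apply to our Denver campus. Deadlines, requirements, and aid.",
    "/contact/": "",
}
for url, issue in audit_meta_descriptions(site):
    print(url, "->", issue)
```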