Excel is beloved by most SEO professionals – and by many paid search aficionados, too.
Of all the SEO tools I use on a daily basis, few are more powerful or more versatile than spreadsheets.
While Word, PowerPoint, and Access have their little idiosyncrasies and challenges in daily work, Excel has stood the test of time and continues to be just as valuable as it has ever been.
There is simply no easier way to sort and categorize data quickly with so little input required.
I use Excel daily to sort lists:
- Keyword lists alphabetically.
- Revenue reports by most revenue earned.
- Pages by highest bounce rate.
- And every other thing you can think of or sort by.
Excel is also great for de-duplicating lists. You can easily take a huge list of data, like inbound links, and sort it and remove all the duplicates with a few clicks.
Pivot tables are phenomenal for categorizing and re-categorizing data sets with many values. Want to know how many people clicked on a particular keyword from a particular referral source by the date they clicked? There’s a pivot table for that.
And the VLOOKUP function is a poor man’s database, perfect for combining lists from multiple places quickly without having to enter all of the data into a cumbersome tool like Access.
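As a minimal sketch of how that combining works – assuming your keywords sit in column A of the current sheet, and a second sheet named "Backlinks" (a hypothetical name for this example) holds matching keywords in its column A and a metric in column B – a lookup like this pulls the metric across:

```
=VLOOKUP(A2, Backlinks!A:B, 2, FALSE)
```

The FALSE argument forces an exact match; wrapping the whole thing in IFERROR lets you blank out keywords that have no match instead of showing #N/A.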
Chances are, you already know all these tricks.
So what follows are some things you might not know you can do with Excel.
Note: While this chapter focuses on Excel, you can perform most of these tasks in any spreadsheet program, including Google Sheets.
1. Wildcards

Excel supports the * character as a wildcard.
If you want to cut off part of a data set, it’s easy to do.
For example, let’s say you have a very large list of links but you only want one sample URL from each domain.
After you find/replace for http:// and https://, you can reduce every single URL to just its root domain by finding and replacing for /* – the asterisk is a wildcard that removes everything from the first forward slash onward.
If you want the slash left on, replace with /.
If you want the / gone, replace with an empty field.
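Sketched out, those Find & Replace passes (Ctrl+H) – assuming standard http/https links in a single column – run in this order:

```
Find: http://    Replace with: (nothing)
Find: https://   Replace with: (nothing)
Find: /*         Replace with: (nothing, or / if you want the trailing slash kept)
```

The order matters: strip the protocol first, or the // in http:// would be caught by the /* pattern.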
Then you can use that de-duplication function to remove all but one example of every URL.
2. LEN

The LEN function counts the number of characters in a string.
This is great for checking the length of titles or meta descriptions. But it can also be used to bring the longest string to the top.
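A minimal example, assuming the title or URL you want to measure sits in cell A2:

```
=LEN(A2)
```

Drag the formula down the column, then sort by it descending to bring the longest strings to the top.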
Taking the previous example, let’s say you wanted one sample URL from each domain that you deduplicated, but you wanted an interior page instead of just the root domain.
You’d take the same steps as above, but instead of editing the URLs in place, you’d copy them to a new column (Source Domain) and edit those instead.
Then, use LEN to find the character length of the Source URL and sort by Col A ascending and Col B descending.
That will bring the longest URL for each domain to the top of the list and then when you de-duplicate, it will save the first instance of that domain – the longest URL.
3. Text to Columns
Text to columns is another brilliant feature of Excel.
You’ve probably used it to make a .csv file useful, by delimiting the data by commas. But did you know you can actually delimit any character you want?
I use this all the time to delimit by the forward slash.
This is a terrific way to take an output from Screaming Frog or another crawler and quickly build out a visual architecture map.
By putting each subfolder in its own column, you can use a pivot table to find out exactly how many pages are in each level, then use the data from the pivot table to quickly build a visual map in something like Visio.
4. Find

The FIND function is one of the most useful functions in Excel.
You can use it to eliminate things from lists of keywords.
For example, let’s say you have a huge list of keywords and you want to find things that only contain one version of a word (useful when categorizing).
You would start a new column, then use this formula where “word” is the word you want to find.
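The formula itself – assuming your keywords run down column A starting at row 2, and "mixed" stands in for whatever word you’re after – would look something like:

```
=FIND("mixed", A2)
```

Note that FIND is case-sensitive; its sibling SEARCH does the same job case-insensitively.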
This will show you the character count at which that word starts, and an error if the word isn’t there at all… like this list where I looked for anything with “mixed”:
Just sort by the “Sort” column descending, and you’ll bring all the keywords that contain “mixed” to the top of the list.
5. Concatenate

Concatenation is great for putting together simple rows of data, but it’s even better for automating the creation of a lot of URLs.
Let’s say you wanted to search a lot of sites to see if a particular keyword was in text on that site… maybe you’re doing a link audit and you need to see if your client’s domain name appears anywhere on a list of sites.
You could (theoretically) concatenate a simple list of commands that you could then run in a crawler like Screaming Frog to see if there are any results.
So, hypothetically speaking, you would do this:
Take the command you need to add and put it into a row in Excel.
Let’s say the site is motorola.com, and you’re looking for the word “motorola” on a lot of websites, one of which is www.cnn.com.
Put the site you are searching on in A1. You’ll use this command in Excel:
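A formula along these lines – assuming www.cnn.com sits in A1 – builds the search string site:www.cnn.com "motorola" (the doubled quotes are how Excel escapes a literal quote inside a string):

```
=CONCATENATE("site:", A1, " ""motorola""")
```

The & operator does the same job more tersely: ="site:"&A1&" ""motorola""".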
If each row in column A has a new value in it, then in theory, you could have a full set of Google commands to put in list mode in Screaming Frog.
Using the Custom configuration, you could look for any results that had “No results found for” motorola.com in the code.
This is all theoretical, of course, because scraping Google is a violation of their terms of service.