Week ending April 12th, 2019

This feels like a historic week. On April 9th we celebrated my firstborn son's 5th birthday, and it's crazy how fast that has gone by. He can now make up wild new stories, build one-of-a-kind Lego vehicles, convince his brother to do his dirty work, and help out with chores once in a while. April 9th is also significant for my business and family because it marks the four-year anniversary of the day I accepted a job offer from Faith Growth to work 20 hours a week as a freelance developer and left my full-time position at an agency. I stayed on as their part-time lead WordPress developer for over a year and a half. Not having to commute anymore and being able to work from home all the time is great for the family. Having the potential to make 2-3x what I was making at an agency, with schedule flexibility, is wonderful. Over the last four years I haven't always known whether I would have enough billable work in a given week, but it has been steady enough that my family has never had to worry about paying the bills or having food to eat. I love what I do and really appreciate the clients, bosses, mentors, and coworkers who have helped me get here.

Practical Ecommerce

I have let my clients know again and again that I am not an SEO expert, a Google Analytics expert, a Google Tag Manager expert, or a Google AdWords expert, but that doesn't stop me from needing to learn more about how those areas affect a website's search engine performance, financial well-being, and page load speed. This week we dug into sitemap issues at Practical Ecommerce because my client was concerned that Google was crawling old sitemaps and possibly wasting effort following redirects from old URLs to new ones.

As part of the investigative process, the client bought me a license for the Screaming Frog SEO Log File Analyser. I've used Screaming Frog's SEO Spider briefly in the past but had never looked at their Log File Analyser tool.

I logged into the hosting control panel, downloaded the (very large) raw access logs for the site, and opened them up in the Log File Analyser. The tool filters the site traffic down to just the URLs that search engine bots have accessed, so I can get a better idea of how bots are crawling the site and where there may be problems. I can type in the slug for a page like "amazon-ecommerce" and see any bot traffic with that string in the URL. That means if bots visited https://domain.com/amazon-ecommerce, https://domain.com/amazon-ecommerce/page/500, https://domain.com/category/marketing/amazon-ecommerce, or https://domain.com/5342342-amazon-ecommerce, they all show up as separate items. I can see how many times each URL was visited, which bots visited it (Google, Bing, etc.), and the HTTP status of the request (200 OK, 404 Not Found, 301 redirect, etc.).
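
Under the hood, that kind of filtering is conceptually simple. Here's a minimal sketch of the idea in Node, assuming a combined-format access log and identifying bots by user agent substrings (the file name and bot list are placeholders; the Log File Analyser does this far more thoroughly, including verifying that a "Googlebot" request really came from Google):

```ts
// bot-log-summary.ts -- tally search engine bot hits per URL from an
// Apache/Nginx combined-format access log (file name is hypothetical).
import { readFileSync } from "node:fs";

// Combined log format:
// IP - - [date] "METHOD path HTTP/x" status size "referer" "user-agent"
const LINE = /^\S+ \S+ \S+ \[[^\]]+\] "\S+ (\S+) [^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"/;

// Substrings that identify the crawlers we care about (an assumption;
// real tools also verify bots via reverse DNS, not just the user agent).
const BOTS = ["Googlebot", "bingbot"];

const counts = new Map<string, number>();

for (const line of readFileSync("access.log", "utf8").split("\n")) {
  const m = LINE.exec(line);
  if (!m) continue;
  const [, path, status, userAgent] = m;
  const bot = BOTS.find((b) => userAgent.includes(b));
  if (!bot) continue;
  const key = `${bot} ${status} ${path}`;
  counts.set(key, (counts.get(key) ?? 0) + 1);
}

// Print the most-requested bot/status/URL combinations first.
[...counts.entries()]
  .sort((a, b) => b[1] - a[1])
  .forEach(([key, n]) => console.log(String(n).padStart(6), key));
```

That per-URL, per-status tally of bot traffic is essentially the view the Analyser gives you, minus the nice UI and filtering.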

It's fun to go through the different URLs that bots visit and try to figure out how they even found them in the first place. We checked that redirects were happening efficiently and that Googlebot wasn't being led down a long rabbit trail of chained redirects. Looking at the sitemaps Google was crawling in Search Console, we discovered it was still referencing an old sitemap from a few years ago. That sitemap had not changed in years and predated the site's move from HTTP to HTTPS, so every link to an article pointed at the old HTTP version of the URL. That meant that whenever Google visited a link in that sitemap, it would first request the HTTP URL and then follow a 301 redirect to the new secure URL. Once we found this, we quickly deleted the old sitemap and told Google Search Console not to use it anymore. Now that we have the tools in place for analysing the log files, I'm sure there will be more detective work to do in the future whenever Google informs us of coverage issues.
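
That redirect problem is also easy to check for by hand. A short sketch, assuming the stale sitemap's address (the URL below is a placeholder) and Node 18+ for the built-in fetch: pull every <loc> entry out of the XML and see whether requesting it returns a 3xx instead of a 200.

```ts
// sitemap-redirect-check.ts -- flag sitemap entries that redirect
// (sitemap URL is a placeholder; requires Node 18+ for built-in fetch).
const SITEMAP = "https://domain.com/old-sitemap.xml";

async function main() {
  const xml = await (await fetch(SITEMAP)).text();
  // Pull every <loc>...</loc> URL out of the sitemap XML.
  const urls = [...xml.matchAll(/<loc>([^<]+)<\/loc>/g)].map((m) => m[1]);

  for (const url of urls) {
    // redirect: "manual" stops fetch from silently following the 301.
    // Some servers disallow HEAD; swap in GET if you get 405s.
    const res = await fetch(url, { method: "HEAD", redirect: "manual" });
    if (res.status >= 300 && res.status < 400) {
      console.log(`${res.status} ${url} -> ${res.headers.get("location")}`);
    }
  }
}

main();
```

In our case, every entry would have reported a 301 from the http:// URL to its https:// twin, which is exactly the wasted crawl effort we wanted to eliminate.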

Wrapping up

I also spent a few more days building Gutenberg blocks and refactoring some of the blocks I created over the last month. A couple of the blocks had very similar functionality, with sortable posts and a post selection tool, so the prudent thing to do was to move the common functionality out of the individual block folders and into a common components folder. Each block is now easier to maintain, and if we need to improve the common components we can change them quickly in one spot instead of across multiple files. After abstracting the common components, I was able to create a new, similar block even faster than I had in the past. Going back to clean up code from a few weeks before and improving on it was definitely a worthwhile investment.
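
For anyone curious what that refactor looks like in practice, here's a minimal sketch of the extraction pattern; all the names are hypothetical and this is not the actual project code. The sortable picker becomes one shared component that every block imports instead of carrying its own copy:

```ts
// components/sortable-post-picker.tsx -- shared control used by several
// blocks (names are hypothetical; the real project structure differs).
import { Button } from "@wordpress/components";

export interface PickedPost {
  id: number;
  title: string;
}

interface Props {
  posts: PickedPost[];
  onChange: (posts: PickedPost[]) => void;
}

// Renders the selected posts with move-up/move-down buttons so editors
// can reorder them; the post search UI is omitted to keep the sketch short.
export function SortablePostPicker({ posts, onChange }: Props) {
  const move = (from: number, to: number) => {
    if (to < 0 || to >= posts.length) return;
    const next = [...posts];
    [next[from], next[to]] = [next[to], next[from]];
    onChange(next);
  };

  return (
    <ul>
      {posts.map((post, i) => (
        <li key={post.id}>
          {post.title}
          <Button onClick={() => move(i, i - 1)}>Up</Button>
          <Button onClick={() => move(i, i + 1)}>Down</Button>
        </li>
      ))}
    </ul>
  );
}
```

Each block's edit function then just renders `<SortablePostPicker posts={attributes.posts} onChange={(posts) => setAttributes({ posts })} />`, so a bug fix or enhancement in the shared file lands in every block at once.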

The complete Work Journal series:
1. Week ending January 25th, 2019
2. Week ending February 1st, 2019
3. Week ending February 8th, 2019
4. Week ending February 15th, 2019
5. Week ending February 22nd, 2019
6. Week ending March 1st, 2019
7. Week ending March 8th, 2019
8. Week ending March 15th, 2019
9. Week ending March 22nd, 2019
10. Week ending March 29th, 2019
11. Week ending April 5th, 2019
12. Week ending April 12th, 2019
13. Week ending April 19th, 2019
14. Week ending August 9th, 2019
15. Week ending September 20th, 2019
16. Week ending September 27th, 2019
17. Week ending December 6th, 2019
18. Week ending October 2nd, 2020
19. Week ending April 2nd, 2021
20. Coding API integrations in Twilio Studio - Work Journal May 8, 2021
21. Trudging through a complex theme implementation - Work Journal October 29, 2021
22. Creating custom Duda widgets - Work Journal December 10, 2021
23. My first Laravel Nova project - Work Journal December 1, 2023
24. Let's talk about Statamic - Work Journal January 12, 2024