I honestly think businesses really need to hire the best SEO resource prior to any new website build.
I keep auditing websites and keep finding the same errors, some of them hugely damaging.
So many businesses use developers who build sites with technologies that Google says are crawlable – in practice, it turns out to be a very different story.
Angular being one PITA – predominantly because the pages do not actually exist as static HTML; content is rendered client-side by JavaScript after the browser fetches data from the server. For example, an Angular application can transition between views where the URL doesn’t change but the page content does – meaning Google cannot crawl that content because it has no distinct URL to request – OR, if it does manage to render it, there are almost always issues.
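To make the problem concrete, here is a minimal sketch of what a crawler that does not execute JavaScript receives from a typical client-rendered Angular app: an empty application shell. The page title, file names, and the helper function below are all hypothetical, for illustration only.

```typescript
// The raw HTML response for ANY route of a client-rendered Angular app
// tends to look like this -- the real content only appears after the
// JavaScript bundle downloads, runs, and fetches data.
const serverResponse = `
<!doctype html>
<html>
  <head><title>My Shop</title></head>
  <body>
    <app-root></app-root>
    <script src="main.js"></script>
  </body>
</html>`;

// A simple check: is there any indexable text inside the app shell
// before JavaScript executes?
function hasPrerenderedContent(html: string): boolean {
  const match = html.match(/<app-root>([\s\S]*?)<\/app-root>/);
  return match !== null && match[1].trim().length > 0;
}

console.log(hasPrerenderedContent(serverResponse)); // false
```

If that check comes back false for your production HTML, a crawler that skips (or fails at) the rendering step sees a blank page.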
One client deployed an Angular website; three weeks later their index size had shrunk from 16,000 pages to 150. They came to me in a panic – we had to drop in pre-rendering and rework the routing, but this solved the issue.
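For reference, the pre-rendering fix usually means adding server-side rendering or static pre-rendering to the build. A rough sketch using Angular Universal is below – the exact commands vary by Angular version, and `my-app` is a placeholder project name:

```shell
# Add server-side rendering support (Angular Universal)
ng add @nguniversal/express-engine

# Pre-render the app's routes to static HTML at build time,
# so crawlers receive real content without executing JavaScript
ng run my-app:prerender
```

After this, the HTML served for each route contains the rendered content up front, and the client-side app takes over once the JavaScript loads.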
Whilst Googlebot has gotten MUCH better at crawling and indexing dynamic content, do NOT rely on this.
Planning a new website? Be careful when using “Web Application” technologies for the front-end of your website.