Our boss recently asked our dev team to build what he calls a "splash page" for the home page of our web site. Not the stand-alone, redirecting splash page of old: instead, a home-page visitor would at first see nothing but a hard-sell, call-to-action "splash" message with two buttons, a mailto link and an "Enter Site" button. If the "Enter Site" button is clicked, the hard-sell message goes away and the "regular" home-page content is revealed.
The solution devised by a team member is to have a #hardsell div and a #main content div (which includes the site header and navigation). Both divs have display: none set with inline CSS. Visibility between the two divs is toggled using sessionStorage, which avoids having to set a cookie.
function splashCheck() {
  // Read the flag that is set the first time the page is viewed in this browser session.
  var readValue = sessionStorage.getItem("myvariable");

  if (readValue === "splash") {
    // The splash has already been shown this session: go straight to the regular content.
    $("#main").css("display", "block");
  } else {
    // First view this session: show the hard-sell splash and remember that we did.
    $("#hardsell").css("display", "block");
    sessionStorage.setItem("myvariable", "splash");
  }
}

splashCheck();
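For clarity, the markup this assumes would look roughly like the following (the IDs and inline display: none come from the description above; the contents are placeholders):

<div id="hardsell" style="display: none;">
  <!-- hard-sell, call-to-action message, mailto link, and "Enter Site" button -->
</div>
<div id="main" style="display: none;">
  <!-- site header, navigation, and regular home-page content -->
</div>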
This is admirably simple, but I am concerned about two things:
- Having display: none on the #main and #hardsell divs means there's no visible content until that splashCheck() function fires. You see nothing if JavaScript is disabled.
- SEO: What will Google bots and other crawlers see on this page? Nothing? (Bad!) Only the #hardsell content? (Also bad!) The #main content? (Preferred!) Both sets of content? (Acceptable!)
I cannot find a definitive answer as to what Google will and will not crawl in this case. Google bots will supposedly crawl hidden content such as accordions, but Google's documentation also says that bots "may have" issues with dynamically hidden sections.
Are my concerns baseless? Is there a way to do this with sessionStorage that lets Google crawl the #main content while showing visitors only the #hardsell content on their initial visit?
3 Answers
The answer to my poorly composed question is "not bad, if JavaScript detection is leveraged." As @EmielZuurbier suggested in a comment on the initial question:
"Your question could be reformulated as: 'How does JavaScript content affect SEO?' The same question is often asked when working with an SPA framework such as React or Angular."
As @EmielZuurbier comments, "Disable your CSS and JavaScript, and refresh the page. That's what Google will see." That is not 100% correct. Do you really think Google's crawlers cannot read JavaScript content? Well, kind of: JavaScript-generated content does make it harder for crawlers to read a page.
To answer your question, I suggest you read this article:
https://www.magnolia-cms.com/blog/spa-seo-mission-impossible.html#:~:text=SPAs%20are%20not%20inherently%20friendly,play%20nicely%20with%20Google's%20crawler.
TL;DR:
If the site will be built with a single-page application (SPA) framework, I would suggest going with a server-side rendering (SSR) implementation instead, say Next.js or Nuxt, which will serve actual HTML content that web crawlers can index.
The bottom line is that once you are serving the content as plain, crawlable HTML, you are fine.
There are plans to make SPAs indexable in the long run, but at the time of writing it is better to stick with what is most effective: plain HTML with the SEO-related meta tags included, or an SSR implementation.
- SPA: no HTML content for web crawlers
- SSR (specifically with Next.js): content for web crawlers
- Vanilla HTML/CSS/JS: content for web crawlers
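Applied to the splash page in your question, that could look roughly like the sketch below. It is only an illustration, assuming the #hardsell/#main markup you describe; the flag name, button id, and mailto address are placeholders of my own. The key point is that both blocks ship in the served HTML with #main visible by default, so crawlers and no-JS visitors get real content, and JavaScript merely layers the splash on top once per session.

<!-- Both blocks are in the HTML that crawlers receive; JS only decides which one a visitor sees. -->
<div id="hardsell" hidden>
  <p>Hard-sell, call-to-action message goes here.</p>
  <a href="mailto:sales@example.com">Email us</a> <!-- placeholder address -->
  <button id="enter-site" type="button">Enter Site</button>
</div>

<div id="main">
  <!-- site header, navigation, and regular home-page content, visible by default -->
</div>

<script>
  // Show the splash only on the first page view of this browser session.
  // With JavaScript disabled none of this runs, and #main stays visible.
  if (!sessionStorage.getItem("splashSeen")) {
    document.getElementById("hardsell").hidden = false;
    document.getElementById("main").hidden = true;
    sessionStorage.setItem("splashSeen", "yes");
  }

  // "Enter Site" dismisses the splash and reveals the regular content.
  document.getElementById("enter-site").addEventListener("click", function () {
    document.getElementById("hardsell").hidden = true;
    document.getElementById("main").hidden = false;
  });
</script>

Whether Google's renderer ends up indexing the JS-toggled state is still up to Google, but the raw HTML it fetches now contains the #main content, which is the case you marked as preferred.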