My website (https://blackmaps.com.ar) is supposed to display rich results: I've added JSON-LD schemas to every page, and the Schema.org validator detects them correctly. However, for some reason, Google's Rich Results Test is not recognizing them.
I'm currently using Next.js 14.2.12, and this is the async function that generates the schema for each page (in this case the homepage):
export async function generateSchemas(t, locale) {
  const baseUrl = "https://blackmaps.com.ar";
  const schema_url = locale && locale !== 'default' ? `${baseUrl}/${locale}` : baseUrl;
  return {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Black Maps",
    "description": t('schema_description'),
    "founder": {
      "@type": "Person",
      "name": "Agustín Sánchez"
    },
    "image": "https://blackmaps.com.ar/image/og-home.png",
    "url": schema_url,
    "sameAs": ["https://x.com/maps_black"],
    "logo": "https://blackmaps.com.ar/image/app-icon-1024.webp",
    "ContactPoint": {
      "@type": "ContactPoint",
      "email": "[email protected]",
      "contactType": "Customer Service",
      "availableLanguage": ["Spanish", "English"]
    }
  };
}
and this is how it's loaded in the page:
// page code
const schemas = await generateSchemas(t, locale);
const schemaJSON = JSON.stringify(schemas);
// more page code
<Script
  id="org-schema"
  type="application/ld+json"
  dangerouslySetInnerHTML={{
    __html: JSON.stringify(schemaJSON),
  }}
/>
// more page code
I’m not an SEO expert, so I would really appreciate any help or guidance. Thank you!
2 Answers
Check the "More info" section of the Rich Results Test. You will see that you are blocking Googlebot from accessing a lot of JavaScript resources, and I suspect that includes the one that injects your structured data.
Review your robots.txt file to make sure it does not disallow resources Google needs to render the structured data, or any other visible content.
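As an illustration (these rules are hypothetical; your actual robots.txt may differ), a blanket disallow on the Next.js build output directory would block the very JS chunks Google needs to render the page, whereas explicitly allowing /_next/static/ keeps the build assets crawlable:

```text
# Too broad: this also blocks the JS/CSS chunks Next.js serves from /_next/static/
User-agent: *
Disallow: /_next/

# Safer: let crawlers fetch static build assets
User-agent: *
Allow: /_next/static/
```

You can confirm what Googlebot actually sees with the robots.txt report in Search Console.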
It could be a cache issue: maybe Google is crawling an older version of your page. Are you using Cloudflare?
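A quick way to check (assuming the site sits behind Cloudflare) is to inspect the response headers: Cloudflare reports whether it served a cached copy in the cf-cache-status header:

```shell
# Fetch only the response headers for the homepage and look at the
# caching-related ones. "cf-cache-status: HIT" means Cloudflare served
# a cached copy, which could be an older version of the page.
curl -sI https://blackmaps.com.ar/ | grep -iE 'cf-cache-status|cache-control|age'
```

If it reports a HIT from before your last deploy, purging the cache in the Cloudflare dashboard and re-running the Rich Results Test is worth a try.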