
I am currently changing the page title and (in the near future) the meta data via a Vue Router watcher, like below:

    watch: {
        $route (to, from) {
            document.title = to.meta.title
        }
    }

This works fine when I inspect the title:

    <html>
    <head>
        <meta charset="utf-8">
        <meta http-equiv="X-UA-Compatible" content="IE=edge">
        <meta name="viewport" content="width=device-width, initial-scale=1">
        <title>Updated Page Title</title>
        <link href=" /css/app.css?id=b21c63ebd0cb0655a4d8" rel="stylesheet">
    </head>

However when I view page source it shows the old info:

    <html>
    <head>
        <meta charset="utf-8">
        <meta http-equiv="X-UA-Compatible" content="IE=edge">
        <meta name="viewport" content="width=device-width, initial-scale=1">
        <title>Old Page Title</title>
        <link href=" /css/app.css?id=b21c63ebd0cb0655a4d8" rel="stylesheet">
    </head>

I need both the title and the meta info to be dynamic for SEO purposes. My question is two-fold:

  1. Will using the vue-meta plugin solve the page-title issue in the original page source?

  2. Do I have to use something like SSR or pre-rendering to get dynamic meta info, or is there any other approach similar to this one?

2 Answers


  1. Chosen as BEST ANSWER

    I ended up using the package suggested by hatef for a pseudo-workaround that avoids full SSR meta editing: https://github.com/butschster/LaravelMetaTags.

    I implemented a middleware layer that checks which domain is being used (this is a multi-domain application) and then dynamically sets the meta title based on the endpoint resolved by Vue Router:

    <?php

    namespace App\Http\Middleware;

    use App\Models\Local\Domains;
    use Butschster\Head\Facades\Meta;
    use Closure;

    class ValidateSEO
    {
        /**
         * Handle an incoming request.
         *
         * @param  \Illuminate\Http\Request  $request
         * @param  \Closure  $next
         * @return mixed
         */
        public function handle($request, Closure $next)
        {
            // e.g. "https://example.com/guide" -> ['https:', '', 'example.com', 'guide']
            $tempSplit = explode('/', url()->current());
            $domain = Domains::where('domain', $tempSplit[2])->first();

            // length-specific check based on site requirements and endpoints
            if ($domain && count($tempSplit) <= 4) {
                // set meta data based on the brand site visited
                $selectedBrand = $domain->brands()->first();

                if ($selectedBrand) {
                    Meta::setTitle($selectedBrand->name . ' | Guide')
                        ->setDescription($selectedBrand->description)
                        ->setKeywords(['sample', 'keywords', 'here']);
                } else {
                    Meta::setTitle('Guide')
                        ->setKeywords(['sample', 'keywords', 'here']);
                }
            } else {
                // set meta data based on the admin endpoint
                Meta::setTitle('Admin | ' . ucfirst($tempSplit[4] ?? ''));
            }

            return $next($request);
        }
    }
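    The middleware only runs once it is registered. A minimal sketch of `app/Http/Kernel.php`, assuming the middleware should run globally; the surrounding kernel contents are illustrative, not from the original project:

```php
<?php

namespace App\Http;

use Illuminate\Foundation\Http\Kernel as HttpKernel;

class Kernel extends HttpKernel
{
    /**
     * Global middleware, run on every request. Appending ValidateSEO here
     * means the meta tags are set before any Blade layout is rendered.
     */
    protected $middleware = [
        // ... framework middleware ...
        \App\Http\Middleware\ValidateSEO::class,
    ];
}
```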
    

    • vue-meta is only applied when your JavaScript is executed and your page is rendered. So no, you are not going to see those meta tags when you view the page source.

    • Do you need SSR or pre-rendering? Maybe; it depends on what you want to achieve. If great SEO is crucial for your website until web crawlers fully support JavaScript, then yes.

    Of course, you can still add some of the meta tags in the backend. Depending on the language/framework you use, there are plenty of options that can help achieve that. For example, for Laravel you can check this package out.

    Another workaround for this problem is to categorize your requests into those coming from the frontend and those coming from crawlers. You can do so by, for example, inspecting the user agent of the request, and then adjusting the response for crawlers accordingly (e.g. injecting the meta tags into the head).


    Here is an example of the workaround I suggested:

    IndexController.php

    <?php
    
    declare(strict_types=1);
    
    namespace App\Http\Controllers;
    
    use Butschster\Head\Facades\Meta;
    use Butschster\Head\Packages\Entities\OpenGraphPackage;
    
    class IndexController extends Controller
    {
        const CRAWLERS = [
            'Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)',
            'Mozilla/5.0 (iPhone; CPU iPhone OS 9_1 like Mac OS X) AppleWebKit/601.1.46 (KHTML, like Gecko) Version/9.0 Mobile/13B143 Safari/601.1 (compatible; AdsBot-Google-Mobile; +http://www.google.com/mobile/adsbot.html)',
            'Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Googlebot/2.1; +http://www.google.com/bot.html) Safari/537.36',
            'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)',
            'Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)',
            'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534+ (KHTML, like Gecko) BingPreview/1.0b',
            'Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11A465 Safari/9537.53 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)',
            'Googlebot-Image/1.0',
            'Mediapartners-Google',
            'facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)',
            'facebookexternalhit/1.1',
            'Twitterbot/1.0',
            'TelegramBot (like TwitterBot)',
        ];
    
        public function index()
        {
            if ($this->isACrawler()) {
                $this->applyMetaTags();
    
                return view('layouts.crawler');
            }
    
            return view('layouts.index');
        }
    
        public function isACrawler(): bool
        {
            return in_array(request()->userAgent(), self::CRAWLERS, true);
        }
    
        private function applyMetaTags()
        {
            $title = 'Something';
            $description = 'Something else';
    
            Meta::prependTitle($title)
                ->addMeta('description', ['content' => $description]);
    
            $og = new OpenGraphPackage('some_name');
    
            $og->setType('Website')
                ->setSiteName('Your website')
                ->setTitle($title)
                ->setUrl(request()->fullUrl())
                ->setDescription($description);
            
            Meta::registerPackage($og);
        }
    }
    

    layouts/crawler.blade.php

    <!DOCTYPE html>
    <html lang="{{ str_replace('_', '-', app()->getLocale()) }}">
        <head>
            @meta_tags
    
            <link rel="shortcut icon" href="{{ asset('favicon.ico') }}">
        </head>
    </html>
    

    So, as you can see, I'm creating a list of crawlers, then checking the request to see whether it comes from a crawler and, if it does, applying the meta tags and returning the specific layout I created for this purpose.
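    Note that exact-match comparison against whole user-agent strings, as in `isACrawler()`, is brittle, since crawler UA strings vary between versions and devices. A sketch of a more tolerant substring check (`isCrawlerUserAgent` is a hypothetical helper, not part of any package):

```php
<?php

// Hypothetical helper: detect crawlers by substring match on the user
// agent instead of comparing whole UA strings, which change frequently.
function isCrawlerUserAgent(?string $userAgent): bool
{
    // Substrings that identify the major crawlers listed above.
    $needles = [
        'Googlebot',
        'AdsBot-Google',
        'Mediapartners-Google',
        'bingbot',
        'BingPreview',
        'facebookexternalhit',
        'Twitterbot',
        'TelegramBot',
    ];

    if ($userAgent === null) {
        return false;
    }

    foreach ($needles as $needle) {
        // stripos: case-insensitive substring search
        if (stripos($userAgent, $needle) !== false) {
            return true;
        }
    }

    return false;
}
```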

    Caveats:

    • This doesn't apply the meta tags to your normal (non-crawler) requests.
    • You need to find a way to map the request URL to the right tags and apply them dynamically, for example using a regular expression.
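    The second caveat can be sketched as a small pattern-to-title table. Everything here (the paths, the titles, and the `titleForPath` helper) is illustrative, not from the original code:

```php
<?php

// Hypothetical helper: derive a page title from the request path using
// a regex map, so meta tags can be applied dynamically per endpoint.
function titleForPath(string $path): string
{
    // Pattern => title; extend with setDescription/setKeywords as needed.
    $map = [
        '#^/admin(/|$)#'     => 'Admin | Guide',
        '#^/brands/[\w-]+$#' => 'Brand | Guide',
    ];

    foreach ($map as $pattern => $title) {
        if (preg_match($pattern, $path)) {
            return $title;
        }
    }

    return 'Guide'; // fallback title for unmatched paths
}
```

    Inside the controller, the result could then be passed to `Meta::setTitle()` before returning the crawler layout.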