I’m encountering performance issues with a Laravel application deployed in a production environment. The application is designed to register views for videos, and it utilizes the firstOrNew method to retrieve or create device records for each view. However, regardless of whether the traffic is heavy or moderate, the application encounters persistent database timeouts, particularly when interacting with the devices table, which contains approximately 20 million records. Additionally, there are instances where a single request to the database takes upwards of 30 seconds to complete. It’s worth noting that the database in use is MariaDB.

public function store(Request $request){
        // define message for when view is already registered
        $mssg = 'view already exist';

        try{
            // Start database transaction
            // Retry 3 times if first attempt fails (this is done to avoid db deadlock issues)
            DB::transaction(function () use ($request, $mssg) {
                // get request data
                $ip = $request->ip();
                $videoId = $request->input('video_id');
                $hostname = $request->has('embed_source') ? $request->input('embed_source') : NULL;

                // check if view already counted
                if(View::where(['ip' => $ip, 'video_id' => $videoId])->count()) throw new Exception($mssg);

                // Register new view
                // Firstly, retrieve or create device
                $agent = new Agent();
                $position = Location::get($ip);
                $device = Device::firstOrNew(
                    [
                        'type' => $agent->isDesktop() ? 'desktop' : ($agent->isTablet() ? 'tablet' : ($agent->isPhone() ? 'phone' : 'unknown')),
                        'browser' => strtolower($agent->browser()),
                        'country' => $position && $position->countryName ? strtolower($position->countryName) : null,
                        'region' => $position && $position->regionName ? strtolower($position->regionName) : null,
                        'city' => $position && $position->cityName ? strtolower($position->cityName) : null,
                        'hostname' => $hostname,
                    ],
                    [
                        'long' => $position && $position->longitude ? $position->longitude : null,
                        'lat' => $position && $position->latitude ? $position->latitude : null,
                    ]
                );
                $device->save();

                // Secondly, save new view with device ID
                $view = new View();
                $view->video_id = $videoId;
                $view->device_id = $device->id;
                $view->ip = $ip;
                $view->watched_duration = 1;
                $view->save();
                // Lastly, increment views count in videos table
                Video::where('id', $videoId)->increment('total_views');

            }, 3);
            
            return response(['viewed' => true], 200);
            
        }catch(Throwable $e){
            if($e->getMessage() == $mssg) return response(['message' => $mssg], 200);

            info($e->getMessage(), [
                'title' => 'COULD NOT REGISTER VIEW',
            ]);
            return response(['error' => $e->getMessage()], 500);
        }
    }

I’m uncertain about the root cause of this issue and how best to resolve it.

I’m seeking guidance on how to optimize the code to improve performance and prevent database timeouts when trying to read from the devices table. Specifically, I’m looking for recommendations on alternative approaches to retrieving or creating device records, optimizing database queries, and scaling the application to handle the current workload efficiently.

Any insights, suggestions, or best practices would be greatly appreciated. Thank you in advance for your assistance!

3 Answers


  1. Your question is a little unclear: are you facing the issue while fetching from the 20 million rows, or while trying to insert that many in one go? Either way, what I understand is that one request takes around 30 seconds to respond, which is far too long for any user. You can use Jobs/Queues in Laravel so that the data-inserting part happens in the background and your server is not loaded too much.
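    The queued-job idea above can be sketched as follows. The job class name `RegisterVideoView` and its constructor parameters are assumptions for illustration, not part of the original code; the existing transaction body would move into `handle()`.

    ```php
    <?php
    // Sketch: move the heavy device-lookup/insert work off the request path.
    // The controller only dispatches the job and responds immediately.

    namespace App\Jobs;

    use Illuminate\Bus\Queueable;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Foundation\Bus\Dispatchable;
    use Illuminate\Queue\InteractsWithQueue;
    use Illuminate\Queue\SerializesModels;

    // Hypothetical job class (name is an assumption).
    class RegisterVideoView implements ShouldQueue
    {
        use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

        public function __construct(
            public string $ip,
            public int $videoId,
            public ?string $hostname,
            public string $userAgent,
        ) {}

        public function handle(): void
        {
            // The existing DB::transaction(...) block (duplicate check,
            // device firstOrNew, View insert, total_views increment)
            // moves here unchanged, running on a queue worker.
        }
    }

    // In the controller, instead of running the transaction inline:
    // RegisterVideoView::dispatch($request->ip(), $videoId, $hostname, $request->userAgent());
    // return response(['viewed' => true], 202);
    ```

    One caveat with this design: the "view already counted" check then runs asynchronously, so the endpoint can no longer tell the client synchronously whether the view was a duplicate.
    
    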

  2. Here are some possible improvements:

    1. Don’t use count() to check the existence of a record; use exists() instead.

    2. Make a composite unique index on the type, browser, country, region, city, and hostname columns, or create a unique hash column and check against the hash of these values.

    3. If you added separate indexes for each of these columns, remove those indexes. They can slow down your write operations.
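    The hash-column idea in point 2 can be sketched in plain PHP: build a deterministic fingerprint of the device attributes and probe a single indexed column instead of six. The column name `device_hash` and this exact normalization are assumptions.

    ```php
    <?php
    // Deterministic hash over the device attributes (sketch).
    // Null and missing values normalize to '' so equivalent devices
    // always produce the same hash.
    function deviceHash(array $attrs): string
    {
        $keys = ['type', 'browser', 'country', 'region', 'city', 'hostname'];
        $parts = [];
        foreach ($keys as $key) {
            $parts[] = strtolower((string) ($attrs[$key] ?? ''));
        }
        return sha1(implode('|', $parts));
    }

    // The lookup then becomes a single-column probe against a unique index:
    // $device = Device::firstOrCreate(
    //     ['device_hash' => deviceHash($attrs)],
    //     $attrs + ['long' => $long, 'lat' => $lat]
    // );
    ```

    With a unique index on `device_hash`, MariaDB compares one short indexed value per lookup rather than six nullable varchar columns, and the unique constraint also prevents duplicate device rows under concurrent requests.
    
    
    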

  3. Handling 20 million records can be time-consuming, and performance optimization becomes crucial. You can consider the following tips:

    1. Database indexing: ensure that the columns used in your query conditions are indexed; this will speed up the search significantly. In your case, consider indexing the type, browser, country, region, city, and hostname columns.
    2. Horizontal partitioning: divide the data based on a shard key, which is often a single column used to find the corresponding shard. In your case, though, you may need to partition based on multiple columns, such as type and country.
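    The indexing advice in point 1 could look like the migration below. The index names are assumptions; it covers the exact columns the `firstOrNew` lookup filters on, plus the duplicate-view check on the views table.

    ```php
    <?php
    // Migration sketch: composite indexes for the two hot lookups
    // (index names devices_lookup_index / views_ip_video_index are assumptions).

    use Illuminate\Database\Migrations\Migration;
    use Illuminate\Database\Schema\Blueprint;
    use Illuminate\Support\Facades\Schema;

    return new class extends Migration
    {
        public function up(): void
        {
            Schema::table('devices', function (Blueprint $table) {
                // Matches the firstOrNew() attribute lookup.
                $table->index(
                    ['type', 'browser', 'country', 'region', 'city', 'hostname'],
                    'devices_lookup_index'
                );
            });

            Schema::table('views', function (Blueprint $table) {
                // Speeds up the "view already counted" existence check.
                $table->index(['ip', 'video_id'], 'views_ip_video_index');
            });
        }

        public function down(): void
        {
            Schema::table('devices', fn (Blueprint $table) => $table->dropIndex('devices_lookup_index'));
            Schema::table('views', fn (Blueprint $table) => $table->dropIndex('views_ip_video_index'));
        }
    };
    ```

    Note that a composite index over six varchar columns can exceed InnoDB’s index key length limit depending on the column sizes and charset, which is another argument for the single hash-column approach from the previous answer.
    
    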