
Slow syncing (laravel-mediable) [CLOSED]

martinEtflais commented on June 11, 2024
Slow syncing


Comments (2)

martinEtflais commented on June 11, 2024

Thanks for the reply. I came up with a solution that dropped the execution time drastically by not using the syncMedia method at all (as you suggested). First, I introduced a dirty column on the media table and mark every record as dirty at the start of a sync. Then I collect all of the file metadata (MIME type, size, etc.) into an array, chunk that array into batches of roughly 5,000 records, and run a query builder upsert (not Eloquent) that sets the dirty column back to 'N'. Finally, I delete every record that is still flagged dirty. I realized this is a common problem with Eloquent models in general when dealing with large datasets. I am still using syncMedia in the admin panel, though, since it works really nicely when dealing with one record at a time. Hope this helps someone facing the same problem.
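
A minimal sketch of that approach, assuming the dirty flag lives on laravel-mediable's media table, that the default disk/directory/filename/extension unique key exists, and with placeholder names ($files, the 'uploads' disk) for the parts the comment does not specify:

```php
use Illuminate\Support\Facades\DB;

// 1. Flag every existing record as dirty before the sync starts.
DB::table('media')->update(['dirty' => 'Y']);

// 2. Collect file metadata into an array, chunk it into batches of ~5000 rows,
//    and upsert with the query builder, clearing the dirty flag as we go.
//    $files and the 'uploads' disk name are placeholders for your own scan results.
collect($files)
    ->map(fn (array $file) => [
        'disk'           => 'uploads',
        'directory'      => $file['directory'],
        'filename'       => $file['filename'],
        'extension'      => $file['extension'],
        'mime_type'      => $file['mime_type'],
        'aggregate_type' => $file['aggregate_type'],
        'size'           => $file['size'],
        'dirty'          => 'N',
    ])
    ->chunk(5000)
    ->each(fn ($chunk) => DB::table('media')->upsert(
        $chunk->all(),
        ['disk', 'directory', 'filename', 'extension'], // needs a unique index over these columns
        ['mime_type', 'aggregate_type', 'size', 'dirty'] // columns to update on conflict
    ));

// 3. Anything still flagged dirty was not seen on disk this pass; delete it.
DB::table('media')->where('dirty', 'Y')->delete();
```

Each batch becomes a single INSERT ... ON DUPLICATE KEY UPDATE style statement, so the database round trips drop from one per file to one per 5,000 files.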


frasmage commented on June 11, 2024

Hi @martinEtflais,

A few thoughts:

  • The MediaUploader is designed for ease of use in simple upload scenarios; syncing 180k files in a single process is definitely a more complex use case. It can absolutely work, but as you noted it is a little slower, since there is a minor cost to the abstractions that make it easy to use, and that cost adds up at this scale. You are probably better off writing a script that handles only the uploader tasks you actually need, in a way that is designed for your scale (which I think you have already done).
  • Do you really need to resync every single file every few hours? Could you compare a file hash/checksum to see whether the contents of the file have changed, and skip it if they haven't? That should cut the execution time down a lot (a rough sketch follows this list).
  • The Eloquent save() method (which MediaUploader uses) executes a separate INSERT/UPDATE query per row, requiring a round trip to the database for each one, which adds up when saving hundreds of thousands of rows one by one. If you batch your process and perform a bulk upsert, you can drastically cut down on network time.
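
A rough sketch of the checksum idea from the second bullet, assuming a hypothetical checksum column added to the media table and using the Media model's getDiskPath() helper:

```php
use Illuminate\Support\Facades\Storage;
use Plank\Mediable\Media;

// Walk the media table in chunks and skip any file whose contents are unchanged.
Media::query()->chunkById(1000, function ($records) {
    foreach ($records as $media) {
        // Hash the file's current contents on its disk.
        $hash = md5(Storage::disk($media->disk)->get($media->getDiskPath()));

        if ($hash === $media->checksum) {
            continue; // unchanged since the last sync, nothing to do
        }

        // Refresh the metadata that may have changed and remember the new hash.
        $media->size = Storage::disk($media->disk)->size($media->getDiskPath());
        $media->checksum = $hash;
        $media->save();
    }
});
```

Only changed files reach the database this way, and those rows could still be collected and written back with the batched query-builder upsert shown in the previous comment.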

