#201 Working with Large Data

Summary

When database tables get large, things can slow down significantly. In this episode, we look at a few techniques that help keep databases performant and the client side responsive.
rails database 6.0 15:42


models/vehicle.rb

class Vehicle < ApplicationRecord
  has_many :vehicle_trackers, dependent: :delete_all
end
20190804015112_create_vehicle_trackers.rb

class CreateVehicleTrackers < ActiveRecord::Migration[6.0]
  def change
    create_table :vehicle_trackers, id: false do |t|
      # t.belongs_to :vehicle, null: false, foreign_key: true
      t.bigint :vehicle_id, null: false, foreign_key: true
      t.decimal :latitude, precision: 10, scale: 6
      t.decimal :longitude, precision: 10, scale: 6
      t.boolean :speeding, default: false
      t.boolean :maintenance_required, default: false
      t.datetime :recorded_on
    end
    # add_index :vehicle_trackers, :id
    # add_index :vehicle_trackers, :vehicle_id
    add_index :vehicle_trackers, [:vehicle_id, :recorded_on]
  end
end
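The composite index orders entries by vehicle_id first and recorded_on second, so all of one vehicle's readings form a contiguous, sorted run that can be located without scanning the whole table. A toy plain-Ruby illustration of that ordering (an analogy only — PostgreSQL actually uses a B-tree, and the data here is made up):

```ruby
# Toy model of the [vehicle_id, recorded_on] composite index: entries are
# sorted by vehicle_id first, then recorded_on.
index = [
  [1, 100], [1, 200], [1, 300],
  [2, 150], [2, 250],
  [3, 120]
].sort

# All rows for vehicle 2 sit in one contiguous slice of the sorted entries.
rows_for_vehicle_2 = index.select { |vehicle_id, _| vehicle_id == 2 }
puts rows_for_vehicle_2.inspect # => [[2, 150], [2, 250]]

# A query filtering only on recorded_on gains nothing from this ordering,
# which is why vehicle_id is listed first in the index.
```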
db/seeds.rb

def vehicle_tracker_record(id)
  {
    vehicle_id: id,
    latitude: Faker::Address.latitude,
    longitude: Faker::Address.longitude,
    speeding: [true, false].sample,
    maintenance_required: [true, false].sample,
    recorded_on: rand(10.years).seconds.ago
  }
end

100.times do
  Vehicle.all.each do |vehicle|
    start_time = Time.now.to_i
    records = [].tap do |array|
      10_000.times do
        # VehicleTracker.create(vehicle_tracker_record(vehicle.id))
        array << vehicle_tracker_record(vehicle.id)
      end
    end
    VehicleTracker.insert_all(records)
    end_time = Time.now.to_i
    puts "Created 10,000 logs in #{end_time - start_time}s"
  end
end
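insert_all (Rails 6+) skips validations and callbacks and issues a single multi-row INSERT, which is why it is so much faster than calling create in a loop. If memory is a concern, the 10,000 row hashes can also be flushed in smaller slices; here is a minimal plain-Ruby sketch of that batching, with a hypothetical bulk_insert lambda standing in for VehicleTracker.insert_all so it runs outside Rails:

```ruby
# Build rows and flush them in slices of 1,000 instead of holding all
# 10,000 hashes before a single bulk insert.
# `bulk_insert` is a hypothetical stand-in for VehicleTracker.insert_all.
inserted = []
batches  = 0
bulk_insert = lambda do |rows|
  batches += 1
  inserted.concat(rows)
end

def tracker_row(vehicle_id)
  { vehicle_id: vehicle_id, speeding: [true, false].sample }
end

all_rows = Array.new(10_000) { tracker_row(1) }
all_rows.each_slice(1_000) { |slice| bulk_insert.call(slice) }

puts "#{inserted.size} rows in #{batches} batches" # => 10000 rows in 10 batches
```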



mcfoton said 3 months ago on Working with Large Data:

Thanks a lot for the episode!
Tip: instead of Time.now (in the seed file) you can use Time.current, which automatically takes the app's configured time zone into account :)
https://apidock.com/rails/Time/current/class

Uriel PRO said 3 months ago on Working with Large Data:

Thank you for the episode, great stuff!

Olaoluwa Afolabi said about 2 months ago on Working with Large Data:

In a case where you want to update the VehicleTracker table by finding a record with VehicleTracker.find(params[:id]), how would you go about it?

Olaoluwa Afolabi said about 2 months ago on Working with Large Data:

The reason I'm asking is that you have suppressed :id from being created on the vehicle_trackers table. In a case where the vehicle_id column on VehicleTracker is the same for every record created for a given Vehicle, how do you get a unique params[:id] to filter and update a record?

kobaltz PRO said about 2 months ago on Working with Large Data:

Yes, if you have to modify the table records like that, then a bigint as the primary key or a uuid would be necessary.

Olaoluwa Afolabi said about 2 months ago on Working with Large Data:

Okay, so something like this will do:

class CreateEntries < ActiveRecord::Migration[6.0]
  def change
    create_table :entries, primary_key: :user_id, id: false do |t|
      t.bigint :user_id, null: false, foreign_key: true
      t.string :purpose, null: false
      t.datetime :time_in
      t.datetime :time_out

      t.timestamps
    end
    add_index :entries, %i[user_id purpose time_in time_out]
  end
end

kobaltz PRO said about 2 months ago on Working with Large Data:

Just remove the id set to false. You wouldn't want to make the user id the primary key.
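Applying that correction to the migration above, one possible version looks like this (a sketch only: it keeps Rails' default bigint :id as the primary key so records can still be found with Entry.find(params[:id]), and uses belongs_to, which also indexes user_id automatically):

```ruby
class CreateEntries < ActiveRecord::Migration[6.0]
  def change
    # No id: false and no primary_key: :user_id — keep the default
    # bigint :id primary key so Entry.find(params[:id]) works.
    create_table :entries do |t|
      t.belongs_to :user, null: false, foreign_key: true
      t.string :purpose, null: false
      t.datetime :time_in
      t.datetime :time_out

      t.timestamps
    end
  end
end
```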
