#201 Working with Large Data

Summary

When database tables get large, things can slow down significantly. In this episode, we look at a few things that help keep the database performant and the client side responsive.
Tags: rails, database · Rails 6.0 · 15:42


models/vehicle.rb

class Vehicle < ApplicationRecord
  has_many :vehicle_trackers, dependent: :delete_all
end
20190804015112_create_vehicle_trackers.rb

class CreateVehicleTrackers < ActiveRecord::Migration[6.0]
  def change
    create_table :vehicle_trackers, id: false do |t|
      # t.belongs_to :vehicle, null: false, foreign_key: true
      # foreign_key: is a belongs_to/references option; on a plain column it has no effect
      t.bigint :vehicle_id, null: false
      t.decimal :latitude, precision: 10, scale: 6
      t.decimal :longitude, precision: 10, scale: 6
      t.boolean :speeding, default: false
      t.boolean :maintenance_required, default: false
      t.datetime :recorded_on
    end
    # add_index :vehicle_trackers, :id
    # add_index :vehicle_trackers, :vehicle_id
    add_index :vehicle_trackers, [:vehicle_id, :recorded_on]
  end
end
db/seeds.rb

def vehicle_tracker_record(id)
  {
    vehicle_id: id,
    latitude: Faker::Address.latitude,
    longitude: Faker::Address.longitude,
    speeding: [true, false].sample,
    maintenance_required: [true, false].sample,
    recorded_on: rand(10.years).seconds.ago
  }
end

100.times do
  Vehicle.all.each do |vehicle|
    start_time = Time.now.to_i
    records = [].tap do |array|
      10_000.times do
        # VehicleTracker.create(vehicle_tracker_record(vehicle.id))
        array << vehicle_tracker_record(vehicle.id)
      end
    end
    VehicleTracker.insert_all(records)
    end_time = Time.now.to_i
    puts "Created 10,000 logs in #{end_time - start_time}s"
  end
end
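One thing to watch with insert_all: it serializes its whole argument into a single multi-row INSERT, and databases cap statement size and bind-parameter counts, so very large payloads are normally sliced into fixed-size batches first. The slicing itself is plain Ruby; the insert_all call in the comment assumes the VehicleTracker model above:

```ruby
# Build the same kind of attribute hashes the seed file generates
# (values here are dummies; the batching is the point).
records = Array.new(10_000) do |i|
  { vehicle_id: 1, latitude: 0.0, longitude: 0.0, recorded_on: Time.now - i }
end

batches = records.each_slice(1_000).to_a

batches.each do |batch|
  # Each batch becomes one bounded multi-row INSERT:
  # VehicleTracker.insert_all(batch)
end

puts batches.size       # => 10
puts batches.last.size  # => 1000
```

A batch size of 1,000 is an arbitrary but common starting point; tune it to your database's statement and parameter limits.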



mcfoton said 17 days ago on Working with Large Data:

Thanks a lot for the episode!
Tip: instead of Time.now (in the seed file) you can use Time.current, which automatically takes the time zone into account if you have one set up in the app :)
https://apidock.com/rails/Time/current/class

Uriel PRO said 14 days ago on Working with Large Data:

Thank you for the episode, great stuff!
