Sequel
Sequel is "a simple, flexible, and powerful SQL toolkit for Ruby" developed as a highly regarded database abstraction library in the Ruby ecosystem. With the concept of "efficient interaction with databases," it combines intuitive DSL with SQL flexibility, supporting major databases like PostgreSQL, MySQL, and SQLite. Despite its lightweight design, it comprehensively provides features necessary for full-scale application development including model functionality, migrations, connection pooling, and transaction management.
GitHub Overview
jeremyevans/sequel
Sequel: The Database Toolkit for Ruby
Overview
Sequel is "a simple, flexible, and powerful SQL toolkit for Ruby" developed as a highly regarded database abstraction library in the Ruby ecosystem. With the concept of "efficient interaction with databases," it combines intuitive DSL with SQL flexibility, supporting major databases like PostgreSQL, MySQL, and SQLite. Despite its lightweight design, it comprehensively provides features necessary for full-scale application development including model functionality, migrations, connection pooling, and transaction management.
Details
As of 2025, Sequel draws on more than 15 years of development and a mature API, making it a standard choice for database work in Ruby. Unlike ActiveRecord, it takes a lightweight approach while providing a clean DSL that stays close to SQL yet reads like Ruby. The separation of datasets and models yields both flexibility and maintainability: complex queries can be written intuitively, and raw SQL can be mixed freely with the DSL. Thread-safe connection pooling, a rich plugin system, and built-in protection against SQL injection give it the robustness required for production use.
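To illustrate that mix of a SQL-like DSL and raw SQL, here is a minimal sketch (the database file, table, and column names are purely illustrative):
require 'sequel'
DB = Sequel.connect('sqlite://example.db')
# Dataset DSL: composable and close to SQL
adults = DB[:users].where { age >= 20 }.order(:name).all
# Raw SQL with placeholders when that is more convenient
rows = DB.fetch('SELECT name, email FROM users WHERE age >= ?', 20).all
# Raw SQL fragments can also be embedded inside the DSL
recent = DB[:users].where(Sequel.lit('age BETWEEN ? AND ?', 20, 30)).all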
Key Features
- Intuitive DSL: query syntax that stays close to SQL while remaining idiomatic Ruby
- Lightweight Design: lower overhead and faster performance than ActiveRecord
- Broad Database Support: PostgreSQL, MySQL, SQLite, Oracle, and more
- Flexible Architecture: datasets and models are separated, and raw SQL can be mixed in freely
- Plugin Mechanism: functionality extended through a rich set of plugins
- Thread-Safe: connection pooling and safe concurrent processing
Pros and Cons
Pros
- Low learning curve thanks to an intuitive, SQL-like DSL
- Lighter and faster than ActiveRecord, with better memory efficiency
- Complex queries stay readable and well structured
- Mixing raw SQL with the DSL gives both flexibility and optimization
- Functionality can be extended freely through the rich plugin system
- Thoroughly tested, highly stable, and maintained over the long term
Cons
- Not as widely adopted as ActiveRecord, so less information is available
- Rails integration is not as tight as with ActiveRecord
- Requires more explicit configuration than convention-over-configuration tools
- May lag behind ActiveRecord for some advanced features
- Smaller ecosystem than ActiveRecord-centric Rails
- Can be less approachable for beginners than ActiveRecord
Reference Pages
Code Examples
Installation and Basic Setup
# Gemfile
gem 'sequel', '~> 5.75'
gem 'pg' # For PostgreSQL
gem 'mysql2' # For MySQL
gem 'sqlite3' # For SQLite
# Basic connection setup
require 'sequel'
# PostgreSQL connection
DB = Sequel.connect('postgres://user:password@localhost/my_database')
# MySQL connection
# DB = Sequel.connect('mysql2://user:password@localhost/my_database')
# SQLite connection (development/test environment)
# DB = Sequel.connect('sqlite://my_database.db')
# Connection with options
DB = Sequel.connect(
adapter: 'postgres',
host: 'localhost',
database: 'my_database',
user: 'username',
password: 'password',
max_connections: 10,
pool_timeout: 5
)
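To confirm that a connection actually works, the standard Database helpers can be used; a short sketch:
# Verify connectivity and inspect the connected database
DB.test_connection                      # raises an error if no connection can be made
puts DB.tables.inspect                  # table names as symbols
puts DB.fetch('SELECT 1 AS ok').first   # run a trivial raw query => {:ok=>1}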
Basic Query Operations
require 'sequel'
# Database connection
DB = Sequel.connect('sqlite://test.db')
# Table creation
DB.create_table :users do
primary_key :id
String :name, null: false
String :email, unique: true
Integer :age
DateTime :created_at, default: Sequel::CURRENT_TIMESTAMP
DateTime :updated_at
end
# Basic dataset operations
users = DB[:users]
# Data insertion
user_id = users.insert(
name: 'John Doe',
email: '[email protected]',
age: 30,
created_at: Time.now
)
# Multiple data insertion
users.multi_insert([
{ name: 'Jane Smith', email: '[email protected]', age: 25 },
{ name: 'Bob Johnson', email: '[email protected]', age: 35 },
{ name: 'Alice Brown', email: '[email protected]', age: 28 }
])
# Data retrieval
all_users = users.all
puts "All users: #{all_users}"
# Conditional search
adult_users = users.where(Sequel[:age] >= 30).all
young_users = users.where { age < 30 }.all
specific_user = users.where(email: '[email protected]').first
# Complex conditions
users_in_age_range = users.where(age: 25..35).all
users_with_pattern = users.where(Sequel.like(:name, '%John%')).all
# Sorting
sorted_users = users.order(:age).all
desc_sorted = users.order(Sequel.desc(:created_at)).all
# Limit and offset
paginated = users.limit(10).offset(20).all
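Update and delete go through the same dataset objects; a short sketch continuing with the users dataset above:
# Update matching rows (returns the number of affected rows)
users.where(email: '[email protected]').update(age: 31, updated_at: Time.now)
# Delete matching rows
users.where { age < 18 }.delete
# Aggregates and single-column helpers
puts users.count
puts users.max(:age)
puts users.select_map(:email).inspect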
Advanced Queries and JOIN Operations
# More complex table structure
DB.create_table :posts do
primary_key :id
String :title, null: false
String :content, text: true
foreign_key :user_id, :users
DateTime :published_at
DateTime :created_at, default: Sequel::CURRENT_TIMESTAMP
end
DB.create_table :comments do
primary_key :id
String :content, null: false
foreign_key :post_id, :posts
foreign_key :user_id, :users
DateTime :created_at, default: Sequel::CURRENT_TIMESTAMP
end
# Dataset definitions
users = DB[:users]
posts = DB[:posts]
comments = DB[:comments]
# JOIN operations
# INNER JOIN
posts_with_users = posts
.join(:users, id: :user_id)
.select(Sequel[:posts][:title], Sequel[:users][:name])
.all
# LEFT JOIN
all_posts_with_users = posts
.left_join(:users, id: :user_id)
.select_all(:posts)
.select_append(Sequel[:users][:name].as(:author_name))
.all
# Complex JOIN
posts_with_comment_count = posts
.left_join(:comments, post_id: :id)
.group(Sequel[:posts][:id])
.select(
Sequel[:posts][:title],
Sequel[:posts][:content],
Sequel.function(:count, Sequel[:comments][:id]).as(:comment_count)
)
.all
# Subqueries
popular_posts = posts.where(
id: comments
.group(:post_id)
.having { count.function.* > 5 }
.select(:post_id)
).all
# Window functions
ranked_posts = posts
.select(
:id,
:title,
:user_id,
Sequel.function(:row_number).over(
partition: :user_id,
order: Sequel.desc(:created_at)
).as(:rank)
)
.all
# Aggregate functions
stats = users
.select(
Sequel.function(:count, :id).as(:total_users),
Sequel.function(:avg, :age).as(:average_age),
Sequel.function(:min, :age).as(:youngest),
Sequel.function(:max, :age).as(:oldest)
)
.first
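When part of a query is easier to express directly in SQL, raw SQL and datasets can be mixed freely; a sketch against the same tables (placeholders keep the raw SQL safe from injection):
# Raw SQL with placeholders; rows come back as hashes
DB.fetch('SELECT u.name, count(p.id) AS post_count ' \
         'FROM users u LEFT JOIN posts p ON p.user_id = u.id ' \
         'GROUP BY u.name HAVING count(p.id) >= ?', 1).each do |row|
  puts "#{row[:name]}: #{row[:post_count]}"
end
# A literal SQL fragment embedded in a dataset
published_count = posts.where(Sequel.lit('published_at IS NOT NULL')).count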
Model Definition and ORM Features
# Sequel model definition
class User < Sequel::Model
# Explicit table name specification (optional)
set_dataset :users
# Plugin loading
plugin :timestamps, update_on_create: true
plugin :validation_helpers
plugin :json_serializer
# Associations
one_to_many :posts
one_to_many :comments
# Validations
def validate
super
validates_presence [:name, :email]
validates_unique :email
validates_format /\A[^@\s]+@[^@\s]+\z/, :email
validates_integer :age, message: 'Age must be numeric'
validates_min_length 2, :name
end
# Instance methods
def full_profile
"#{name} (#{age} years old) - #{email}"
end
# Class methods
def self.adults
where(Sequel[:age] >= 20)
end
def self.by_age_group(min_age, max_age)
where(age: min_age..max_age)
end
end
class Post < Sequel::Model
plugin :timestamps
plugin :validation_helpers
many_to_one :user
one_to_many :comments
def validate
super
validates_presence [:title, :content, :user_id]
validates_min_length 5, :title
validates_min_length 10, :content
end
def published?
!published_at.nil?
end
def publish!
update(published_at: Time.now)
end
  # Scope-like dataset methods (available on the model and on association datasets)
  dataset_module do
    def published
      exclude(published_at: nil)
    end

    def recent(days = 7)
      where(Sequel[:created_at] >= Date.today - days)
    end
  end
end
class Comment < Sequel::Model
plugin :timestamps
plugin :validation_helpers
many_to_one :post
many_to_one :user
def validate
super
validates_presence [:content, :post_id, :user_id]
validates_min_length 3, :content
end
end
# Model usage examples
# User creation
user = User.create(
name: 'John Smith',
email: '[email protected]',
age: 28
)
# Post creation
post = user.add_post(
title: 'How to use Sequel',
content: 'Sequel is an amazing Ruby ORM...'
)
# Comment addition
comment = post.add_comment(
content: 'Very helpful, thank you!',
user: user
)
# Data retrieval via associations
user_posts = user.posts
post_comments = post.comments
comment_author = comment.user
# Conditional associations
published_posts = user.posts_dataset.published.all
recent_comments = post.comments_dataset.where(Sequel[:created_at] >= Date.today - 7).all
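To avoid N+1 queries when traversing associations, Sequel offers eager loading; a minimal sketch using the models above:
# Load users, their posts, and each post's comments in a fixed number of queries
User.eager(posts: :comments).all.each do |u|
  puts "#{u.name}: #{u.posts.length} posts"
end
# eager_graph uses a single query with JOINs instead of separate queries
Post.eager_graph(:user).all.each do |p|
  puts "#{p.title} by #{p.user&.name}"
end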
Transaction Management and Batch Operations
# Basic transaction
DB.transaction do
user = User.create(name: 'Transaction User', email: '[email protected]', age: 30)
post = user.add_post(title: 'Transaction Post', content: 'Content...')
post.add_comment(content: 'Comment', user: user)
end
# Automatic rollback on exception
begin
DB.transaction do
user = User.create(name: 'Test User', email: '[email protected]', age: 25)
# Intentionally cause an error
invalid_post = user.add_post(title: '', content: '') # Validation error
end
rescue Sequel::ValidationFailed => e
puts "Transaction was rolled back: #{e.message}"
end
# Manual rollback: raising Sequel::Rollback aborts the transaction
# without propagating an exception to the caller
DB.transaction do
  user = User.create(name: 'Conditional User', email: '[email protected]', age: 30)
  raise Sequel::Rollback, "Condition was not met" if some_condition_fails?
  user.add_post(title: 'Conditional Post', content: 'Content...')
end
# Batch operations
class BatchOperations
def self.bulk_insert_users(user_data_array)
DB.transaction do
user_data_array.each_slice(1000) do |batch|
User.multi_insert(batch)
end
end
end
def self.bulk_update_users(updates)
DB.transaction do
updates.each do |user_id, attributes|
User[user_id].update(attributes)
end
end
end
def self.efficient_data_processing
# Memory-efficient large data processing
User.paged_each(rows_per_fetch: 1000) do |user|
# Process each user
process_user(user)
end
end
# Utilizing prepared statements
def self.optimized_batch_insert(data_array)
insert_statement = DB[:users].prepare(:insert, :insert_user,
name: :$name, email: :$email, age: :$age)
DB.transaction do
data_array.each do |data|
insert_statement.call(data)
end
end
end
  # Internal helpers (`private` does not apply to methods defined with `def self.`,
  # so private_class_method is used instead)
  def self.process_user(user)
    # Actual processing logic
    puts "Processing user: #{user.name}"
  end
  private_class_method :process_user

  def self.some_condition_fails?
    # Some condition check
    false
  end
  private_class_method :some_condition_fails?
end
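For changes that do not need per-row hooks or validations, a single dataset-level UPDATE is usually far more efficient than saving records one at a time; a short sketch:
# One UPDATE statement instead of N individual saves
# (note: dataset updates skip model hooks and validations)
DB.transaction do
  touched = User.where { age >= 20 }.update(updated_at: Time.now)
  puts "Touched #{touched} rows"
end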
Migrations and Schema Management
# Migration example
Sequel.migration do
up do
create_table :categories do
primary_key :id
String :name, null: false, unique: true
String :description, text: true
DateTime :created_at, default: Sequel::CURRENT_TIMESTAMP
end
alter_table :posts do
add_foreign_key :category_id, :categories
add_index :category_id
end
end
down do
alter_table :posts do
drop_column :category_id
end
drop_table :categories
end
end
# Complex schema changes
Sequel.migration do
change do
create_table :user_profiles do
primary_key :id
foreign_key :user_id, :users, null: false, unique: true
String :bio, text: true
String :website
String :location
TrueClass :public_profile, default: true
DateTime :created_at, default: Sequel::CURRENT_TIMESTAMP
DateTime :updated_at
index :user_id
index :public_profile
end
# Existing table changes
alter_table :users do
add_column :is_active, TrueClass, default: true
add_column :last_login_at, DateTime
add_index :is_active
end
end
end
# Custom schema methods
class SchemaManager
def self.setup_initial_schema
DB.create_table! :users do
primary_key :id, type: :Bignum
String :name, size: 100, null: false
String :email, size: 255, null: false, unique: true
Integer :age
TrueClass :is_active, default: true
DateTime :created_at, default: Sequel::CURRENT_TIMESTAMP
DateTime :updated_at
index :email, unique: true
index [:is_active, :created_at]
end
DB.create_table! :posts do
primary_key :id, type: :Bignum
String :title, size: 200, null: false
String :content, text: true
foreign_key :user_id, :users, type: :Bignum, null: false
DateTime :published_at
DateTime :created_at, default: Sequel::CURRENT_TIMESTAMP
DateTime :updated_at
index :user_id
index :published_at
index [:user_id, :published_at]
end
end
def self.add_full_text_search
DB.run <<~SQL
CREATE INDEX posts_content_fts ON posts
USING gin(to_tsvector('english', title || ' ' || content));
SQL
end
end
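Migration files are normally applied with the built-in migrator; a sketch that assumes the migrations above are stored as files under db/migrations:
require 'sequel'
Sequel.extension :migration
DB = Sequel.connect(ENV.fetch('DATABASE_URL', 'sqlite://my_database.db'))
# Apply all pending migrations
Sequel::Migrator.run(DB, 'db/migrations')
# Migrate down to a specific version (0 rolls everything back)
# Sequel::Migrator.run(DB, 'db/migrations', target: 0)
# The same can be done from the command line:
#   sequel -m db/migrations postgres://user:password@localhost/my_database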
Plugin System and Customization
# Custom plugin creation
module Sequel
module Plugins
module Sluggable
module InstanceMethods
def generate_slug
self.slug = name.downcase.gsub(/[^a-z0-9]+/, '-').gsub(/-+/, '-').chomp('-')
end
def to_param
slug || id.to_s
end
end
module ClassMethods
def find_by_slug(slug)
where(slug: slug).first
end
end
end
end
end
# Plugin usage
class Category < Sequel::Model
plugin :sluggable
plugin :timestamps
plugin :validation_helpers
  def validate
    super
    validates_presence :name
    validates_unique :name
    # Regenerate the slug when the record is new or the name has changed
    generate_slug if new? || modified?(:name)
  end
end
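Usage of the custom sluggable plugin might look like the following sketch (it assumes the categories table also has a slug column):
category = Category.create(name: 'Ruby ORM Tips')
puts category.slug                          # => "ruby-orm-tips"
puts Category.find_by_slug('ruby-orm-tips').name
puts category.to_param                      # slug if present, otherwise the id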
# Rich built-in plugin utilization
class Article < Sequel::Model
# Commonly used plugins
  plugin :timestamps            # Automatic created_at / updated_at management
  plugin :validation_helpers    # Validation helpers
  plugin :json_serializer       # JSON conversion
  plugin :dirty                 # Change tracking
  plugin :tree                  # Hierarchical structure (expects a parent_id column)
  plugin :nested_attributes     # Nested attributes for associations
  # Note: the caching plugin requires a cache store argument, e.g.
  #   plugin :caching, some_cache_store
  # and pagination is a dataset extension rather than a model plugin
  # (see paginated_articles below)
# Associations
many_to_one :user
many_to_one :category
one_to_many :comments
many_to_many :tags
def validate
super
validates_presence [:title, :content, :user_id]
validates_min_length 5, :title
end
# dirty plugin utilization
def track_changes
if column_changed?(:title)
puts "Title changed from '#{initial_value(:title)}' to '#{title}'"
end
end
  # Pagination via the dataset pagination extension
  def self.paginated_articles(page = 1, per_page = 10)
    dataset.extension(:pagination).paginate(page, per_page)
  end
end
# Connection pooling and performance optimization
class DatabaseOptimizer
def self.configure_connection_pool
DB.pool.after_connect = proc do |conn|
# Post-connection configuration
case DB.database_type
when :postgres
conn.exec("SET statement_timeout = '30s'")
conn.exec("SET lock_timeout = '10s'")
when :mysql
conn.query("SET sql_mode = 'STRICT_TRANS_TABLES'")
end
end
end
  def self.connection_pool_stats
    {
      size: DB.pool.size,                        # currently open connections
      max_size: DB.pool.max_size,                # configured maximum
      allocated: DB.pool.allocated,              # connections checked out by threads
      available: DB.pool.available_connections   # idle connections waiting in the pool
    }
  end
end
  # Query logging and slow query monitoring
  def self.enable_query_logging
    DB.loggers << Logger.new($stdout)
    # Log queries that take longer than 1 second at WARN level
    DB.log_warn_duration = 1.0
  end
end
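These helpers would typically be invoked once at application boot; a short usage sketch:
require 'logger'
DatabaseOptimizer.configure_connection_pool
DatabaseOptimizer.enable_query_logging
puts DatabaseOptimizer.connection_pool_stats.inspect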