
Logidze

Logidze provides tools for logging DB record changes when using PostgreSQL, just like audited and paper_trail do (but faster).

Logidze allows you to create a DB-level log (using triggers) and gives you an API to browse this log. The log is stored with the record itself in a JSONB column. No additional tables are required.

🤔 How is Logidze pronounced?

Other requirements:

  • Ruby ~> 2.7
  • Rails >= 6.0 (for Rails 4.2 use version <=0.12.0, for Rails 5.x use version <= 1.2.3)
  • PostgreSQL >= 10.0
Sponsored by Evil Martians


Installation

Add Logidze to your application's Gemfile:

gem "logidze", "~> 1.1"

Install required DB extensions and create trigger function:

bundle exec rails generate logidze:install

This creates a migration that adds the trigger function and enables the hstore extension.

Run migrations:

bundle exec rails db:migrate

NOTE: Logidze uses DB functions and triggers, hence you need to use the SQL format for the schema dump:

# application.rb
config.active_record.schema_format = :sql

Using with schema.rb

Logidze seamlessly integrates with the fx gem to make it possible to continue using schema.rb for the database schema dump.

Add the fx gem to your Gemfile and run the same Logidze generators: rails g logidze:install or rails g logidze:model.

If for some reason Logidze can't detect the presence of fx in your bundle, you can enforce it by passing the --fx option to the generators.

On the other hand, if you have the fx gem but don't want Logidze to use it, pass the --no-fx option.
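
For example (both invocations are illustrative; they only affect how the generated migration is written):

# force fx-style function definitions even if Logidze didn't detect fx
bundle exec rails generate logidze:install --fx
# generate plain SQL functions even though fx is bundled
bundle exec rails generate logidze:install --no-fx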

Configuring models

Run the following generator to enable change tracking for an Active Record model and add a log_data::jsonb column to the table:

bundle exec rails generate logidze:model Post
bundle exec rails db:migrate

This also adds the has_logidze line to your model, which provides methods for working with logs.
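
After running the generator, the model should look roughly like this (a minimal sketch; the rest of the class body stays untouched):

class Post < ActiveRecord::Base
  has_logidze
end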

By default, Logidze tries to infer the path to the model file from the model name and may fail, for example, if you have an unconventional project structure. In that case, you should specify the path explicitly:

bundle exec rails generate logidze:model Post --path "app/models/custom/post.rb"

Backfill data

To backfill table data (i.e., create initial snapshots), add the --backfill option to the generator:

bundle exec rails generate logidze:model Post --backfill

Now your migration should contain an UPDATE ... statement that populates the log_data column with the current state.

Otherwise, a full snapshot will be created the first time the record is updated.

You can create a snapshot manually by performing the following query:

UPDATE <my_table> as t
SET log_data = logidze_snapshot(to_jsonb(t))

Or by using the following methods:

Model.create_logidze_snapshot

# specify the timestamp column to use for the initial version (by default the current time is used)
Model.create_logidze_snapshot(timestamp: :created_at)

# filter columns
Model.create_logidze_snapshot(only: %w[name])
Model.create_logidze_snapshot(except: %w[password])

# or call a similar method (but with !) on a record

my_model = Model.find(params[:id])
my_model.create_logidze_snapshot!(timestamp: :created_at)

A snapshot is only created if log_data is null.

Log size limits

You can provide the --limit option to the generator to limit the size of the log (by default, it's unlimited):

bundle exec rails generate logidze:model Post --limit=10

Tracking only selected columns

You can log changes to particular columns only. There are mutually exclusive --except and --only options for this:

# track all columns, except `created_at` and `active`
bundle exec rails generate logidze:model Post --except=created_at,active
# track only `title` and `body` columns
bundle exec rails generate logidze:model Post --only=title,body

Logs timestamps

By default, Logidze tries to get a timestamp for a version from the record's updated_at field whenever appropriate. If your model does not have that column, Logidze will gracefully fall back to statement_timestamp().

To change the column name or disable this feature completely, you can use the timestamp_column option:

# will try to get the timestamp value from `time` column
bundle exec rails generate logidze:model Post --timestamp_column time
# will always set version timestamp to `statement_timestamp()`
bundle exec rails generate logidze:model Post --timestamp_column nil # "null" and "false" will also work

Undoing a Generated Invocation

If you would like to redo your rails generate from scratch, you can, as with other generators, use rails destroy to revert it. This deletes the migration file and undoes the injection of has_logidze into the model file:

bundle exec rails destroy logidze:model Post

IMPORTANT: If you use a non-UTC time zone for Active Record (config.active_record.default_timezone), you MUST always infer log timestamps from a timestamp column (e.g., when backfilling data); otherwise, you may end up with inconsistent logs (#199). In general, we recommend using UTC as the database time zone unless there is a very strong reason not to.

Using with partitioned tables

Logidze supports partitioned tables for PostgreSQL 13+ without any additional configuration. For PostgreSQL 11/12, you should use after triggers. To do that, provide the --after-trigger option to the generator:

bundle exec rails generate logidze:model Post --after-trigger

NOTE: Record changes are written as a full snapshot if the partition has changed during the update.

IMPORTANT: Using Logidze for partitioned tables in PostgreSQL 10 is not supported.

Usage

Basic API

Your model now has a log_data column, which stores the changes log.

To retrieve a record version at a given time, use the #at or #at! methods:

post = Post.find(27)

# Show current version
post.log_version #=> 3

# Show log size (number of versions)
post.log_size #=> 3

# Get copy of a record at a given time
post.at(time: 2.days.ago)

# or revert the record itself to the previous state (without committing to DB)
post.at!(time: "2018-04-15 12:00:00")

# If no version found
post.at(time: "1945-05-09 09:00:00") #=> nil

You can also get a revision by version number:

post.at(version: 2)

NOTE: If log_data is nil, #at(time:) returns self and #at(version:) returns nil. You can opt in to returning nil for time-based #at as well by setting Logidze.return_self_if_log_data_is_empty = false.
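
For example, in an initializer (the path is just a conventional place for the setting):

# config/initializers/logidze.rb
# With this setting, #at(time:) returns nil (instead of self) when log_data is nil
Logidze.return_self_if_log_data_is_empty = false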

It is also possible to get versions for relations:

Post.where(active: true).at(time: 1.month.ago)

You can also get a diff from a specified time:

post.diff_from(time: 1.hour.ago)
#=> { "id" => 27, "changes" => { "title" => { "old" => "Logidze sucks!", "new" => "Logidze rulz!" } } }

# the same for relations
Post.where(created_at: Time.zone.today.all_day).diff_from(time: 1.hour.ago)

NOTE: If log_data is nil, #diff_from returns an empty Hash as "changes".

It is also possible to retrieve the list of a record's versions:

post.logidze_versions # => Enumerator

# you can use #to_a to return all versions
post.logidze_versions.to_a

# or take a few, or call any Enumerable method
post.logidze_versions.take(2)
post.logidze_versions.find do
  _1.title == "old title"
end

# we can also add options
post.logidze_versions(reverse: true) # from newer to older
post.logidze_versions(include_self: true) # returns self as the last record or the first one when `reverse` is set to true

There are also #undo! and #redo! methods (and the more general #switch_to!):

# Revert the record to the previous state (and store this state in the DB)
post.undo!

# You can now use redo! to revert back
post.redo!

# More generally, you can revert the record to an arbitrary version
post.switch_to!(2)

You can initiate reloading of log_data from the DB:

post.reload_log_data # => returns the latest log data value

Typically, if you update a record after #undo! or #switch_to!, you lose all "future" versions, and #redo! is no longer possible. However, you can provide an append: true option to #undo! or #switch_to!, which creates a new version with the old data. Caveat: when switching to a newer version, append has no effect.

post = Post.create!(title: "first post") # v1
post.update!(title: "new title") # v2
post.undo!(append: true) # v3 (with same attributes as v1)

Note that redo! will not work after undo!(append: true), because the latter creates a new version instead of rolling back to an old one. Alternatively, you can configure Logidze to always default to append: true:

Logidze.append_on_undo = true

Track meta information

You can store any meta information you want inside your versions (IP address, user agent, etc.). To add it, wrap your code in a block:

Logidze.with_meta({ip: request.ip}) do
  post.save!
end

NOTE: You should pass metadata as a Hash; passing keyword arguments doesn't work in Ruby 3.0+.

Meta expects a hash, so you don't need to encode and decode JSON manually.

By default, .with_meta wraps the block in a DB transaction. That could lead to unexpected behavior, especially when using .with_meta within an around_action. To avoid wrapping the block in a DB transaction, use the transactional: false option:

Logidze.with_meta({ip: request.ip}, transactional: false) do
  post.save!
end

Important: If you use connection pooling (e.g., PgBouncer), using .with_meta without a transaction may lead to unexpected results (since meta is set for a connection). Without a transaction, we cannot guarantee that the same connection will be used for queries (including metadata cleanup).

Important: In Rails, after_commit callbacks are executed after the transaction is committed, and, thus, after the with_meta block has finished; the meta won't be added to changes captured in the after_commit phase. One particular scenario is having associations with touch: true (touch updates are executed after commit).
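
A hypothetical illustration of this caveat (the models are made up):

class Comment < ActiveRecord::Base
  belongs_to :post, touch: true
  has_logidze
end

Logidze.with_meta({ip: request.ip}) do
  comment.save! # the comment's new version carries the meta
end
# The touch on comment.post runs in the after_commit phase,
# so the post's version created by it does NOT carry the meta.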

Track responsibility

A special application of meta information is storing the author of the change, called the Responsible ID. Most likely, you would want to store current_user.id that way.

To provide a responsible_id, wrap your code in a block:

Logidze.with_responsible(user.id) do
  post.save!
end

And then, to retrieve the responsible_id:

post.log_data.responsible_id

Logidze does not require responsible_id to be a SomeModel ID; it can be anything. Thus, Logidze does not provide methods for retrieving the corresponding object. However, you can easily write one yourself:

class Post < ActiveRecord::Base
  has_logidze

  def whodunnit
    id = log_data.responsible_id
    User.find(id) if id.present?
  end
end

And in your controller:

class ApplicationController < ActionController::Base
  around_action :use_logidze_responsible, only: %i[create update]

  def use_logidze_responsible(&block)
    Logidze.with_responsible(current_user&.id, &block)
  end
end

By default, .with_responsible wraps the block in a DB transaction. That could lead to unexpected behavior, especially when using .with_responsible within an around_action. To avoid wrapping the block in a DB transaction, use the transactional: false option:

Logidze.with_responsible(user.id, transactional: false) do
  post.save!
end

Disable logging temporarily

If you want to make updates without logging (e.g., a mass update), you can turn it off the following way:

Logidze.without_logging { Post.update_all(seen: true) }

# or

Post.without_logging { Post.update_all(seen: true) }

Reset log

Reset the history for a record (or records):

# for a single record
record.reset_log_data

# for relation
User.where(active: true).reset_log_data

Full snapshots

You can instruct Logidze to create a full snapshot instead of a diff for a particular log entry.

It could be useful in combination with .without_logging: first, you perform multiple updates without logging, and then you create a log entry with the current state. To do that, use the Logidze.with_full_snapshot method:

record = Model.find(params[:id])

Logidze.without_logging do
  # perform multiple write operations with record
end

Logidze.with_full_snapshot do
  record.touch
end

Associations versioning

Logidze also supports associations versioning. This feature is disabled by default (due to the number of edge cases). You can learn more in the wiki.
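
To enable it, flip the global setting (as described in the wiki; the initializer path is just a conventional place):

# config/initializers/logidze.rb
Logidze.associations_versioning = true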

Dealing with large logs

By default, Active Record selects all the table's columns when no explicit select statement is specified.

That could slow down query execution if you have field values that exceed the size of the data block (typically 8KB). In that case, PostgreSQL turns on its TOAST mechanism, which requires reading from multiple physical locations to fetch the row's data.

If you do not use compaction (generate logidze:model ... --limit N) for log_data, you're likely to face this problem.

Logidze provides a way to avoid loading log_data by default (and load it on demand):

class User < ActiveRecord::Base
  # Add `ignore_log_data` option to macros
  has_logidze ignore_log_data: true
end

If you want Logidze to behave this way by default, configure the global option:

# config/initializers/logidze.rb
Logidze.ignore_log_data_by_default = true

# or

# config/application.rb
config.logidze.ignore_log_data_by_default = true

However, you can override it by explicitly passing ignore_log_data: false to has_logidze. You can also enforce loading log_data in place by using the .with_log_data scope, e.g., User.all.with_log_data loads all the users with log_data included.

A benchmark chart (omitted here) shows the difference in PG query time before and after turning ignore_log_data on. (Special thanks to @aderyabin for sharing it.)

If you try to call #log_data on a model loaded this way, you'll get nil. If you want to fetch the log data (e.g., during console debugging), use user.reload_log_data, which forces loading this column from the DB.
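
For example, assuming a User model declared with has_logidze ignore_log_data: true:

user = User.find(params[:id])
user.log_data        #=> nil (the column wasn't selected)
user.reload_log_data #=> the log data, fetched from the DB on demand

# Or load it eagerly for a whole relation:
User.where(active: true).with_log_data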

Handling records deletion

Unlike, for example, PaperTrail, Logidze is designed to only track changes. If the record has been deleted, everything is lost.

If you want to keep the changes history after record deletion as well, consider using specialized soft-delete tools, such as Discard or Paranoia.

See also the discussion: #61.

Handling PG exceptions

By default, Logidze raises an exception, which causes the entire transaction to fail. To change this behavior, you can override the logidze_capture_exception(error_data jsonb) function.

For example, you may want to raise a warning instead of an exception and complete the transaction without updating log_data.

Related issues: #193

Upgrading

We try to make the upgrade process as simple as possible. For now, the only required action is to create and run a migration:

bundle exec rails generate logidze:install --update

This updates the core logidze_logger DB function. There is no need to update tables or triggers.

NOTE: When using fx, you can omit the --update flag; a migration containing only the updated functions will be created.

If you want to update Logidze settings for the model, run migration with --update flag:

bundle exec rails generate logidze:model Post --update --only=title,body,rating

You can also use the --name option to specify the migration name to avoid duplicate migration names:

$ bundle exec rails generate logidze:model Post --update --only=title,body,rating --name add_only_filter_to_posts_log_data

    create db/migrate/20202309142344_add_only_filter_to_posts_log_data.rb

Pending upgrade check [Experimental]

Logidze can check for a pending upgrade. Use Logidze.on_pending_upgrade = :warn to be notified by a warning, or Logidze.on_pending_upgrade = :raise if you want Logidze to raise an error.
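
For example, in an initializer (the path is just a conventional place for the setting):

# config/initializers/logidze.rb
Logidze.on_pending_upgrade = :warn  # warn when Logidze detects outdated DB functions
# Logidze.on_pending_upgrade = :raise # or raise an error instead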

Upgrading from 0.x to 1.0 (edge)

Schema and migrations

Most SQL function definitions have changed without backward compatibility. Perform the following steps to upgrade:

  1. Re-install Logidze: bundle exec rails generate logidze:install --update.

  2. Re-install Logidze triggers for all models: bundle exec rails generate logidze:model <model> --update.

    NOTE: If you had previously specified whitelist/blacklist attributes, you will need to include the --only/--except option as appropriate. You can easily copy these column lists from the previous logidze migration for the model.

  3. Remove the include Logidze::Migration line from the old migration files (if any)—this module has been removed.

Rewrite legacy logidze migrations to not use the #current_setting(name) and #current_setting_missing_supported? methods, or copy them from the latest 0.x release.

API changes

The deprecated time positional argument has been removed from the #at and #diff_from methods. Now you need to use keyword arguments, i.e., model.at(some_time) -> model.at(time: some_time).

Log format

The log_data column has the following format:

{
  "v": 2, // current record version,
  "h": // list of changes
    [
      {
        "v": 1,  // change number
        "ts": 1460805759352, // change timestamp in milliseconds
        "c": {
            "attr": "new value",  // updated fields with new values
            "attr2": "new value"
            },
        "r": 42, // Resposibility ID (if provided), not in use since 0.7.0
        "m": {
          "_r": 42 // Resposibility ID (if provided), in use since 0.7.0
          // any other meta information provided, please see Track meta information section for the details
        }
      }
    ]
}

If you specify the limit in the trigger definition, the log size will not exceed it. When a new change occurs and there is no more room for it, the two oldest changes are merged.

Troubleshooting

log_data is nil when using Rails fixtures

Rails fixtures are populated with triggers disabled. Thus, log_data is null initially for all records. You can use #create_logidze_snapshot manually to build initial snapshots.
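
For example, you could build the snapshots in your test setup (a sketch; the fixture name is illustrative):

# in a test's setup block
post = posts(:one) # a fixture record with log_data == nil
post.create_logidze_snapshot!(timestamp: :created_at)
post.log_version #=> 1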

How to make this work with Apartment 🤔

First, read the Apartment docs on installing PostgreSQL extensions. You need to use the described approach to install hstore (and drop the migration provided by Logidze during installation).

Second, set config.use_sql = true in the Apartment configuration.

Finally, when using fx along with schema.rb, you might face a problem with duplicate trigger definitions (for different schemas). Here is a patch to fix this: dump_triggers.rake.

Related issues: #50.

PG::UntranslatableCharacter: ERROR

That could happen when your row data contains null bytes. You should sanitize the data before writing it to the database. From the PostgreSQL docs: the jsonb type also rejects \u0000 (because that cannot be represented in PostgreSQL's text type).

Related issues: #155.

pg_restore fails to restore a dump

First, when restoring data dumps, you should consider using the --disable-triggers option (unless you have a strong reason to invoke the triggers).

When restoring data dumps for a particular PostgreSQL schema (e.g., when using Apartment), you may encounter an issue with non-existent Logidze functions. That happens because pg_dump adds SELECT pg_catalog.set_config('search_path', '', false); and, thus, breaks the existing triggers/functions, because they live either in "public" or in a tenant's namespace (see this thread).

PG::NumericValueOutOfRange: ERROR: value overflows numeric format

Due to the usage of hstore_to_jsonb_loose under the hood, a string that looks like a number in scientific notation (e.g., "557236406134e62000323100") may be cast by Postgres to a number (a pretty big one, for sure), which fails with this exception.

Related issues: #69.

Development

This project requires a PostgreSQL instance running with the following setup:

# For testing
createdb -h postgres -U postgres logidze_test

# For benchmarks
createdb -h postgres -U postgres logidze_bench
createdb -h postgres -U postgres logidze_perf_bench
psql -d logidze_bench -c 'CREATE EXTENSION IF NOT EXISTS hstore;'

This project is compatible with Reusable Docker environment setup.

Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/palkan/logidze.

License

The gem is available as open source under the terms of the MIT License.


logidze's Issues

Unable to load version using ActiveRecord `updated_at`

If I understood the docs, the timestamp is derived from the updated_at timestamp at the record level. In my case, I have these set by ActiveRecord, so they are always available.

If I try to load a version at(time: <record.updated_at>), I am getting back nil. Somehow, the timestamps seem to be off by 1 millisecond after inspecting log_data. Simply adding 1 to the timestamp will yield the expected result, but I feel I am missing something here.

I simply want to grab the version based on the recorded updated_time. Any tips would be appreciated, thanks!

Not working with factory girl gem

It's working fine in development and production mode, but when running tests where the factory_girl gem is loaded, it is causing errors. Check the stack trace below for more details.

undefined local variable or method `has_logidze' for #<Class:0x007fa12bcc8700> (NameError)
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activerecord-5.0.0/lib/active_record/dynamic_matchers.rb:21:in `method_missing'
/Users/ankur/dev/project/project/app/models/bot.rb:2:in `<class:Bot>'
/Users/ankur/dev/project/project/app/models/bot.rb:1:in `<top (required)>'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activesupport-5.0.0/lib/active_support/dependencies.rb:293:in `require'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activesupport-5.0.0/lib/active_support/dependencies.rb:293:in `block in require'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activesupport-5.0.0/lib/active_support/dependencies.rb:259:in `load_dependency'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activesupport-5.0.0/lib/active_support/dependencies.rb:293:in `require'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activesupport-5.0.0/lib/active_support/dependencies.rb:380:in `block in require_or_load'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activesupport-5.0.0/lib/active_support/dependencies.rb:37:in `block in load_interlock'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activesupport-5.0.0/lib/active_support/dependencies/interlock.rb:12:in `block in loading'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activesupport-5.0.0/lib/active_support/concurrency/share_lock.rb:117:in `exclusive'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activesupport-5.0.0/lib/active_support/dependencies/interlock.rb:11:in `loading'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activesupport-5.0.0/lib/active_support/dependencies.rb:37:in `load_interlock'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activesupport-5.0.0/lib/active_support/dependencies.rb:358:in `require_or_load'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activesupport-5.0.0/lib/active_support/dependencies.rb:511:in `load_missing_constant'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/activesupport-5.0.0/lib/active_support/dependencies.rb:203:in `const_missing'
/Users/ankur/dev/project/project/spec/factories/bots.rb:2:in `block in <top (required)>'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/factory_girl-4.5.0/lib/factory_girl/syntax/default.rb:49:in `instance_eval'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/factory_girl-4.5.0/lib/factory_girl/syntax/default.rb:49:in `run'
/Users/ankur/.rvm/gems/ruby-2.2.2@project/gems/factory_girl-4.5.0/lib/factory_girl/syntax/default.rb:7:in `define'
/Users/ankur/dev/project/project/spec/factories/bots.rb:1:in `<top (required)>'

I have a factory

FactoryGirl.define do
  factory :bot, class: Bot do |f|
    f.name 'Bot'
    f.description { Faker::Lorem.paragraph }

    factory :public_bot do |f|
      f.remote_avatar_url { Faker::Avatar.image }
    end

  end
end

Compress changes into the last version on demand

Thanks for the speedy work on #93!

The recent debounce changes are welcome, but I'm wondering if we could add a feature to do this on demand. I have a background job which generates a table of contents for a document after it's been saved. Since the table of contents links to headers within the document, this job needs to modify the document itself. Currently, the changes this job makes to the document are ignored by Logidze, but it would be better if I could tell Logidze to lump a set of changes into the most recent version in log_data, in a block. It would also aid in generating reasonable history when importing a document.

Sequel support

Hi! It would be awesome if this gem supported not only ActiveRecord, but Sequel, too. I fiddled around with it the other day and it seems that providing Sequel support isn't hard.
What do you think about it? Should we make an adapter that would detect what abstraction is being used or should we make a separate gem for Sequel?

PG::UndefinedObject - ERROR: unrecognized configuration parameter "logidze.disabled"

I am using rails 5.0.0, with production and development server.

When I pull data from the production server to my local server, the configuration for logidze.disabled is always missing, probably because role-specific config is ignored during exporting/importing.

I even added a rake task which sets this config variable, but I am still facing issues in development whenever I update any model that has Logidze installed.

namespace :db_sync do
  desc "Post import task"
  task :post_import do
    ActiveRecord::Base.connection.execute("SET logidze.disabled TO off; select current_setting('logidze.disabled')")
  end
end

Any suggestions?

Options whitelist and blacklist with a long list of attributes

Hi,

First of all, I have to say that I'm really happy using this gem. Just wanted to note that when using a long list of column names to whitelist (63 names on a table with 109 columns), it parsed the list badly and included all the columns for the involved model.
I have solved it in my particular case by writing the column list to blacklist directly in the Postgres function, in the generated migration.
But maybe it could be solved if the option --blacklist= (and its whitelist equivalent) worked with a comma-separated list between quotes, like:

--whitelist='first_column_name,second_column_name'

I'm sure that this way we would escape the shell behaviour.

Thanks,

Question: do limits not apply to records with more than X log_data entries?

Looking at the migration that is generated, it looks like row collapsing only occurs when the number of log entries exactly matches the set row limit.

For example, if a limit of 30 is set, any record that has fewer than 30 log_data entries would work as expected; however, if a record already has 31+ entries, it would not collapse future entries and would instead continue to add new entries as if there were no limit.

Is this a good enhancement opportunity?

Query by responsible_id

Let's say we have something like:

Logidze.with_responsible(user.id) do
  Post.create
end

It would be really useful to be able to find records by their responsible_id:

Post.where(log_data: { responsible_id: user.id })

Otherwise to be able to do this you have to add a real responsible_id column on each of the models and duplicate the with_responsible logic with a slight twist.

Since log_data is JSONB, I think something like this (pseudo-code) is doable, but it requires to know the structure of log_data:

Post.where("(log_data -> 'h' @> '[r ->> ?]')", user.id)

Support for ruby >2.1

Is there any plan and/or roadmap to support Ruby > 2.1? The requirements state only ~> 2.1.
Is this because there are some hard dependencies on ruby 2.1?

Actually, I am using the latest Ruby patch level, 2.4.1, and I am looking for a logging mechanism. Logidze sounds cooler than the others, but this specific requirement is a roadblock for me (and for others too, of course).

Let me know if I can help in any form for this issue.

Add user_id who changed

Great gem! Is there a way to add the user_id that is responsible for the change, via current_user if present?

`association_versioning` with `ignore_log_data: true` causes `MissingAttributeError`

I'm taking logidze for a spin and it seems to be a great fit for our needs, but I've encountered what appears to be a bug. With association_versioning and ignore_log_data turned on, after retrieving a copy of the model at a given point in time using at, I'm then unable to retrieve the associated records, instead receiving the error ActiveModel::MissingAttributeError: missing attribute: log_data.

Ruby version: 2.3.5
ActiveRecord version: 4.2.10

The issue vanishes if I either turn off ignore_log_data on the associated model (ie. the Comment model in the example below) or disable association versioning. So far I haven't found a way to work around the issue.

Steps to reproduce, based on the association tracking example:

Migrations

# Post
class CreatePosts < ActiveRecord::Migration
  def change
    create_table :posts do |t|
      t.string :title
      t.string :content
      t.timestamps null: false
    end
  end
end
# Comment
class CreateComments < ActiveRecord::Migration
  def change
    create_table :comments do |t|
      t.string :body
      t.references :post, index: true, foreign_key: true
      t.timestamps null: false
    end
  end
end

Models

class Post < ActiveRecord::Base
  has_logidze ignore_log_data: true
  has_many :comments
end
class Comment < ActiveRecord::Base
  has_logidze ignore_log_data: true
  belongs_to :post
end

Code to reproduce

[24] pry(main)> p = Post.create(title: "New Post", content: "Bug hunting")
=> #<Post:0x00007fb7ac642c18
 id: 4,
 title: "New Post",
 content: "Bug hunting",
 created_at: Mon, 11 Mar 2019 12:28:08 UTC +00:00,
 updated_at: Mon, 11 Mar 2019 12:28:08 UTC +00:00,
 log_data: nil>
[25] pry(main)> p = Post.with_log_data.last
=> #<Post:0x00007fb7ad2bbd78
 id: 4,
 title: "New Post",
 content: "Bug hunting",
 created_at: Mon, 11 Mar 2019 12:28:08 UTC +00:00,
 updated_at: Mon, 11 Mar 2019 12:28:08 UTC +00:00,
 log_data:
  #<Logidze::History:0x00007fb7afa619e0
   @data=
    {"h"=>
      [{"c"=>
         {"id"=>4,
          "title"=>"New Post",
          "content"=>"Bug hunting",
          "created_at"=>"2019-03-11T12:28:08.089259",
          "updated_at"=>"2019-03-11T12:28:08.089259"},
        "v"=>1,
        "ts"=>1552307288089}],
     "v"=>1}>>
[26] pry(main)> p.comments.create(body: "First comment")
=> #<Comment:0x00007fb7af89a620
 id: 3,
 body: "First comment",
 post_id: 4,
 created_at: Mon, 11 Mar 2019 12:28:46 UTC +00:00,
 updated_at: Mon, 11 Mar 2019 12:28:46 UTC +00:00,
 log_data: nil>
[27] pry(main)> p.at(time: Time.now)
=> #<Post:0x00007fb7ad2bbd78
 id: 4,
 title: "New Post",
 content: "Bug hunting",
 created_at: Mon, 11 Mar 2019 12:28:08 UTC +00:00,
 updated_at: Mon, 11 Mar 2019 12:28:08 UTC +00:00,
 log_data:
  #<Logidze::History:0x00007fb7afa619e0
   @data=
    {"h"=>
      [{"c"=>
         {"id"=>4,
          "title"=>"New Post",
          "content"=>"Bug hunting",
          "created_at"=>"2019-03-11T12:28:08.089259",
          "updated_at"=>"2019-03-11T12:28:08.089259"},
        "v"=>1,
        "ts"=>1552307288089}],
     "v"=>1},
   @versions=
    [#<Logidze::History::Version:0x00007fb7af92ba30
      @data=
       {"c"=>
         {"id"=>4,
          "title"=>"New Post",
          "content"=>"Bug hunting",
          "created_at"=>"2019-03-11T12:28:08.089259",
          "updated_at"=>"2019-03-11T12:28:08.089259"},
        "v"=>1,
        "ts"=>1552307288089}>]>>
[28] pry(main)> p.comments.length
ActiveModel::MissingAttributeError: missing attribute: log_data
from /Users/dmills/.rvm/gems/ruby-2.3.5@yesware/gems/activerecord-4.2.10/lib/active_record/attribute_methods/read.rb:93:in `block in _read_attribute'

I hope that helps. Let me know if I can provide any additional details that might be useful. And thanks for the great work on this library!

Migrate from papertrail?

I currently use papertrail, and I want to delete its massively verbose table, but I don't want to lose the history.

Is there a way I can migrate my papertrail versions over to logidze?

Columns filtering

What

We need a way to restrict versioning to a subset of columns.
These can be done in 2 ways: through whitelisting and blacklisting.

Why

To reduce the size of the log data.

How

Extend logidze_logger function (and related functions) to support 2 more arguments, except and only. Then CREATE TRIGGER migration looks like:

execute <<-SQL
  CREATE TRIGGER logidze_on_users
  BEFORE UPDATE OR INSERT ON users FOR EACH ROW
  WHEN (current_setting('logidze.disabled') <> 'on')
  EXECUTE PROCEDURE logidze_logger(null, null, '{name, role, email, phone}');
SQL

Where the first argument is a limit (as it is now), the second argument is a blacklist of columns, and the third argument is a whitelist of columns.

We should be able to specify these parameters through generator script:

rails generate logidze:model Post --only="title,user_id,tags"

rails generate logidze:model Post --except="created_at,updated_at"

We also need a way to upgrade existing triggers (maybe through the --upgrade flag).

latest release (v0.10.0) breaks db tests

Tell us about your environment

Ruby Version: ruby 2.6.3p62 (2019-04-16 revision 67580) [x86_64-darwin18]

Rails Version: Rails 6.0.0.rc1

PostgreSQL Version: PostgreSQL 10.6 on x86_64-pc-linux-musl, compiled by gcc (Alpine 8.2.0) 8.2.0, 64-bit

Logidze Version: v0.10.0

What did you do?

Running tests that rely on test fixtures caused a bunch of failures that were seemingly unrelated to the tests being ran.

What did you expect to happen?

Tests would continue to pass.

What actually happened?

I have a feeling that this may be related to parallel tests and Rails stock test fixtures.

I kept my app on v0.9.0 and also got some recent console errors while trying to set a responsible_id in an ActiveJob. The error output was WARNING: SET LOCAL can only be used in transaction blocks

I realize this isn't a very actionable error, but wanted to bring it up in case others ran into something similar. Feel free to close if it's not helpful.

Removing columns breaks older logs

Possibly related to #59
We have this migration:

class ChangeIsAgentToAccountType < ActiveRecord::Migration[5.2]
  def up
    add_column :users, :account_type, :integer, default: 0, null: false

    User.all.each do |user|
      user.update(account_type: user.is_agent ? User.account_types[:agent] : User.account_types[:user])
    end

    remove_column :users, :is_agent
  end

  def down
    add_column :users, :is_agent, :boolean, default: false, null: false

    User.all.each do |user|
      user.update(is_agent: user.account_type_agent? ? true : false)
    end

    remove_column :users, :account_type
  end
end

so, is_agent column is now gone, but it is still in the log data.

In our admin, we print all changes for a specific record:

panel 'Version history' do
  versions = []
  log_size = resource.log_size
  (1..log_size).each do |version|
    #versions << resource.at(version: version)
  end
  table_for versions.reverse! do
    column('Version', &:log_version)
    column('Change') { |resource| log_data_change(resource) }
    column('Actions') { |resource| link_to 'Restore', "#{resource.id}/restore?log_version=#{resource.log_version}", method: :put, data: { confirm: t('are_you_sure') } }
  end
end

Error: can't write unknown attribute `is_agent` at: resource.at(version: version)

Stacktrace:

activemodel (5.2.0) lib/active_model/attribute.rb:207:in `with_value_from_database'
activemodel (5.2.0) lib/active_model/attribute_set.rb:57:in `write_from_user'
activerecord (5.2.0) lib/active_record/attribute_methods/write.rb:51:in `_write_attribute'
activerecord (5.2.0) lib/active_record/attribute_methods/write.rb:45:in `write_attribute'
logidze (0.6.3) lib/logidze/model.rb:206:in `apply_column_diff'
logidze (0.6.3) lib/logidze/model.rb:198:in `block in apply_diff'
logidze (0.6.3) lib/logidze/model.rb:197:in `each'
logidze (0.6.3) lib/logidze/model.rb:197:in `apply_diff'
logidze (0.6.3) lib/logidze/model.rb:211:in `build_dup'
logidze (0.6.3) lib/logidze/model.rb:112:in `at_version'
logidze (0.6.3) lib/logidze/model.rb:70:in `at'

Seems like it could be fixed by the gem. Is there any temporary fix we could do?

UPDATE:
Temporary fix: resource.log_data.data['h'].map{|l| l['c'].delete('is_agent') }

Empty Entries With Black Listing

I track a position in a list and the status of data display in a table row.

execute <<-SQL
  CREATE TRIGGER logidze_on_brokerage_transactions
  BEFORE UPDATE OR INSERT ON brokerage_transactions FOR EACH ROW
  WHEN (coalesce(#{current_setting('logidze.disabled')}, '') <> 'on')
  EXECUTE PROCEDURE logidze_logger(100, 'updated_at', '{created_at, position, stale}');
SQL

The position and stale columns get updated a lot, and they don't actually matter to the audit. I ended up with a bunch of records that just have updated_at as the changed field. I then changed it to:
EXECUTE PROCEDURE logidze_logger(100, '', '{created_at, position, updated_at, stale}');

Now I have a bunch of change entries with no entries in them. Basically I'm trying to only store changes if there is a change in a field that isn't created_at, position or stale. What should it look like?

Are race conditions an issue?

If I have two processes, one which intends to update the log_data (model b), and one wants to disable logging (model a), would this scenario be possible:

disable logging to save a
start saving b
start saving a
finish saving b
finish saving a
re-enable logging after saving a

b's log wasn't updated?

Functionality and performance questions, and approval recommendation

I apologize in advance if these questions don't belong here, I'm quite curious about the functionality and performance of this gem. I'm mainly asking this questions from the viewpoint that most of my records would be accessing the most recent version only, and very rarely needing to review past changes.

  1. It's difficult to tell from the documentation, does this save the original record, then apply changes every time it is loaded? Or does it save the most recent and contain a log of all past changes? According to the documentation it looks like the former ("attr": "new value" is mentioned in the log format). If the former, what kind of performance impact is there when attempting to view the most current version of a record (e.g. 10 versions vs 1000 versions when attempting to load a single record)?
  2. In regards to question 1, if the most recent version is saved and all past changes are logged in the log_data column would there be any noticable benefit to saving the log_data in a separate table, then creating a relationship table between log_data and the table you want to track? e.g. An "Event" has the most recent version saved, as the most recent version is saved it creates an entry in EventLogsRelationship, and the the Event's previous data is saved to an EventLogs entry. I understand this is NOT how the gem works now, I'm mainly curious if there would be a potential performance increase with this method, specifically because it wouldn't load the log data unless it's called.
  3. Is it possible to add an "Approval" feature where someone such as an Admin or Mod must approve changes prior to them being implemented? I feel like this would be similar in some ways to the "Cooldown Time" someone suggested.

Time on the Log not saving correctly

Hi, a small issue came to mind when looking into the gem for possible use.

When making a change, the timestamps are saved correctly, but the time in log_data seems to be off by ~48,000 years.

Record version:

 {"c"=>{"updated_at"=>"2018-06-25 15:18:31.22996"},
    "r"=>"37",
    "v"=>3,
     "ts"=>1529939911230}>

Converting ts
Time.at(version.time) gives 50451-11-12 20:40:30 +0000

redoing with append breaks carrierwave

Hi, I haven't been able to debug much yet but wanted to report this. I enabled logidze for a model with a carrierwave field. I'm showing a table with all revisions and providing a button to undo them, so I wanted to use append: true.

While model.undo! and model.redo! work correctly, model.redo! append: true makes carrierwave fail. Apparently it's re-running the upload process without a file attached to it, so it can't read any of its attributes (in my particular setup it can't find the extension to create the filename).

I'll update this issue with my findings :)

Question: Is there a way to manually trigger a version?

I'm thinking maybe if I want to manually handle versioning, I'd disable logging globally, and then whenever I want to create a version, call:

MyModel.connection.execute(%(
  EXECUTE PROCEDURE logidze_snapshot(to_jsonb(#{attributes}), '{excluded columns}')
))

?

ActiveRecord::RangeError: PG::NumericValueOutOfRange: ERROR: value overflows numeric format

I'm having a problem with any string that appears to be in scientific notation. For instance '557236406134e62000323100'.

The specific error is:

ActiveRecord::RangeError: PG::NumericValueOutOfRange: ERROR:  value overflows numeric format
CONTEXT:  PL/pgSQL function logidze_logger() line 68 at assignment

Postgres version: 9.6

Edit: I have my suspicions that it is because of the to_jsonb method.

Returns the value as json or jsonb. Arrays and composites are converted (recursively) to arrays and objects; otherwise, if there is a cast from the type to json, the cast function will be used to perform the conversion; otherwise, a scalar value is produced. For any scalar type other than a number, a Boolean, or a null value, the text representation will be used, in such a fashion that it is a valid json or jsonb value.

I believe to_jsonb is erroneously attempting to typecast the string, could this be a bug in PG?

Unmentioned downside for object deletion

This library looks very interesting and promising, but one downside I see to storing the snapshots at the same level as the record is that you are totally unaware of any delete operations.

For example, I want to keep an exact log of a join association between two records. Using Logidze, I'm unable to see any historical changes between the two models, as there is no way to see any removed associations.

Or I'm missing something?

How to get associations at version?

Currently you need to specify a timestamp to get associations processed by logidze. If you specify a version (at: version) it will not set logidze_requested_ts and therefore will skip the logidze processing.

I tried some code to set the logidze_requested_ts when using at version but it has issues. I can create an incomplete PR for review.

Handle schema changes

We do not handle such cases, for example, as adding or removing columns. That might lead to undesired behaviour (see #58).

Some thoughts.

When a new column has been added, we might have no initial value for it (i.e. in the first version). Hence for the version prior to adding the column we should (either):

  • nullify it
  • set to default value (if any)
  • raise an error when it's read (thus mimic the old behaviour–undefined column).

Not sure which one is the right way.

Make ignore_log_data globally configurable

Let's make it possible to set ignore_log_data globally instead of per-model:

# application.rb
Rails.application.config.logidze.select_ignore_log_data = true

NOTE: if we explicitly provide ignore_log_data: false with has_logidze, then it overrides the global setting.

Also, let's add an API to temporarily change this global setting:

# admin/application_controller.rb
class Admin::ApplicationController < AC::Base
  around_action :enable_logidze_log_data

  def enable_logidze_log_data
    Logidze.with_ignore_log_data(false) { yield }
  end
end

Add support to store metadata

Logidze is really awesome, but it can't replace papertrail in a project I'm working on.
This kind of stuff does not belong to the resource itself; it would be more pleasant to store it alongside the log.

# application_controller.rb
def info_for_paper_trail
  { ip: request.remote_ip }
end

# model.rb
attr_accessor :comment
[...]
has_paper_trail meta: { reason: comment }

Add API to nullify log_data

Sometimes it might be necessary to reset the history for a record (or records) (e.g. with some GDPR-related stuff).

Let's add an API to do that. Something like this:

# for single record
record.reset_log_data #=> which is, probably, equal to Logidze.without_logging { record.update_column(:log_data, nil) }

# for relation
User.where(...).reset_log_data #=> Logidze.without_logging { relation.update_all(log_data: nil) }

How to turn on Associations versioning

So I saw in the wiki that association versioning is available and I need that feature. It has in there this:

This feature is considered experimental, so it's disabled by default. You can turn it on by setting Logidze.associations_versioning to true. Associations versioning works with belongs_to and has_many associations

Where do I turn on the associations_versioning setting and set it to true?

Complex types diffs

Related to #30

The problem

The log doesn't respect complex data types such as jsonb, array, hstore, etc. (due to the usage of hstore under the hood). The values for these types are just strings.

We have to manually cast them back to appropriate Ruby structure.

Example:

post = Post.create!(tags: ['some', 'tag']) # tags is an array column

post.update!(tags: ['other'])

post.reload.diff_from(...)
#=> {"old"=>{"tags"=>["some", "tag"]}, "new"=>"{\"tags\": [\"other\"]}"}

# but should be

#=> {"old"=>{"tags"=>["some", "tag"]}, "new"=>{"tags"=>["other"]}}

The problem also may affect the at methods.

Trouble loading active_model/type/value in Rails 5.2.0.beta2

Issue: I'm getting this error when I try to load the logidze gem in Rails 5.2.0.beta2.

Error:

There was an error while trying to load the gem 'logidze'. (Bundler::GemRequireError)
Gem Load Error is: uninitialized constant ActiveModel::Type::Value

Where it happens: /lib/logidze/history/type.rb:2

What solved it for me:

ActiveSupport.on_load(:active_record) do
  require 'active_model/type/value'
end

Link: https://github.com/MatiasFMolinari/logidze/blob/df9e885078517b3107e817979be1be50d2c56c75/lib/logidze/history/type.rb#L2

Source:
I don't fully understand how the loading changed in this new rails version. I stumbled upon the solution on this commit: https://github.com/iaankrynauw/paranoia/commit/49b9e68ee79d8a8e5fa04add1e93b9d69ccf9315

Feature request: saving association changes

What

Sometimes it is really handy to save data describing changes in associated objects, but there is no gem for that. PaperTrail tries to do that, but that's an experimental feature, which is not working very well.

Why

If I had something like:

class User < ActiveRecord::Base
  has_many :attachments

end

class Attachment < ActiveRecord::Base
  belongs_to :user

end

and a user uploads a new attachment, there is not going to be meaningful log data that I can use later.
Best case scenario: I'll get log_data with timestamps (which are most likely going to be blacklisted) as the only changed attribute.
But I'd like to have the ability to build a changelog or to revert to any previous version of User, including matching versions of associated objects.

How

Unfortunately, I don't have a clear singular solution, hence a feature request instead of a pull request.
If I had to implement an ad-hoc solution, I'd just serialize every associated object and treat them as nested attributes, but I am not really sure if that's the best way of doing it.

Removing specific versions from the history

I am currently working on a way to clean up the history, as I did not use without_logging wisely in the past for some maintenance tasks, and I now want to expose part of the history to users.

My approach is to remove the specific version from the JSON and update the numbers of the other versions.

Is this something that would make sense to add to the gem?
Are there any obvious issues with my approach?

Add cooldown time

Is it possible to add the following feature? I'd set a cooldown time since the last update, during which the last version is overwritten instead of creating a new version. When the time is up, a new version is created, as with the current behavior. It would be good for autosaving.

Deployment on Heroku

First of all, awesome gem!

I just tried to deploy to Heroku, and the migration returned this error.

ActiveRecord::StatementInvalid: PG::InsufficientPrivilege: ERROR: permission denied to set parameter "logidze.disabled" CONTEXT: SQL statement "ALTER DATABASE ***DB_NAME*** SET logidze.disabled TO off"

Any ideas on how this might be fixed?

Support with_meta without transactions

Problem

Currently, wrapping execution with Logidze.with_meta (or Logidze.with_responsible) call also wraps everything into a DB transaction, i.e.:

Logidze.with_meta(data) do
  do_something
end

# structurally equal to
ActiveRecord::Base.transaction do
  set_logidze_meta(data)
  do_something
  reset_logidze_meta
end

That could lead to unexpected behavior, especially when using .with_responsible within an around_action hook: all the DB operations that happened within the action could be rolled back in case of an exception. Thus, wrapping everything into .with_responsible significantly changes the way the application works.

Potential Solution

We need to investigate the possibility of removing the transaction from .with_meta. The reason why it was used is to guarantee the state at the end of the block at the DB side (that's the way SET LOCAL works in transactions).

If it's possible then we need to upgrade the current behavior by adding a transactional: true|false option for .with_meta/.with_responsible methods with true by default.

Also, we might consider adding a global switch (Logidze.transactional_meta = true | false).

Discussion: Is SQL Format _really_ a requirement?

I understand why, currently, this is a requirement. I'm not questioning current/past limitations, just thinking about future possibilities.

I was looking at the internals of another gem (Scenic) I use (for managing materialized views in particular) and they are able to handle raw SQL-based migration/schema statements without the hard requirement of having to use SQL formatted db/schema.rb

Digging deeper, and looking at a working example
Scenic seems to have a helper, exposed to the AR::Schema define block and named create_view (or create_materialized_view), that gets called, passing off the SQL as a heredoc block param.
(source code of helpers https://github.com/scenic-views/scenic/blob/b51ed69a223440076a5fa2d48bc5d7764e450eb9/lib/scenic/statements.rb)

These in turn (Scenic has a DB-adapter abstraction layer) get passed off to the adapter, each one having its own special way of handling the actual execution of the statement, e.g., Scenic.database.create_view(name, sql_definition).

def create_materialized_view(name, sql_definition, no_data: false)
  raise_unless_materialized_views_supported
  execute <<-SQL
    CREATE MATERIALIZED VIEW #{quote_table_name(name)} AS
    #{sql_definition}
    #{'WITH NO DATA' if no_data};
  SQL
end

Obviously there is some filler here that Logidze would not need, but is the general idea one we could use: supply a helper for the schema/migration system to use to create the trigger(s)?

Is there something else I'm missing? It has been a bit since I really dove deep on logidze's codebase so maybe I'm not seeing another blocker to this general idea.

We would have to have a helper that essentially takes over the job currently done by the generator, but we have that functionality: namely, Logidze knows the SQL needed for its triggers given a model name. So, theoretically, we should be able to drop the hard requirement of SQL-formatted schema files, and instead of running a generator, we'd write a vanilla migration with a helper, such as logidze_model :post, blacklist: [:created_at, :active], timestamp_column: nil

Not sure what this looks like for updating the definition, some food for thought.

This all said, I don't think that doing so will directly, dynamically allow anything new or interesting, other than lowering the bar for alternative database support, such as MySQL/Maria, Redshift, etc.

Trying to get Logidze working

Hi, I am having some issues getting Logidze working. I have followed your install instructions and whenever I try to create I get the following error

PG::UndefinedFunction - ERROR:  function to_jsonb(positions) does not exist
LINE 1: SELECT logidze_snapshot(to_jsonb(NEW.*), ts_column, columns_...
                                ^
HINT:  No function matches the given name and argument types. You might need to add explicit type casts.
QUERY:  SELECT logidze_snapshot(to_jsonb(NEW.*), ts_column, columns_blacklist)
CONTEXT:  PL/pgSQL function logidze_logger() line 20 at assignment

Any ideas?

I am using the Apartment gem which is tenanting my customers using postgres schemas, perhaps this is the problem?

model_file_path should use Rails' autoloader, rather than assume a file location.

Currently, I get the error 'No such file or directory @ rb_sysopen - path/app/models/underscored_model_name.rb'

This is because of https://github.com/palkan/logidze/blob/master/lib/generators/logidze/model/model_generator.rb#L91

I propose that instead, 'UnderscoredModelName'.constantize should be used, so the file path can be retrieved via:

[20] pry(main)> ap name = 'StepDefinitions::Base'; 
klass = name.constantize; 

(klass.methods - ActiveRecord::Base.methods).select{ |meth| 
  location_path = klass.method(meth).source_location[0]; 
  location_path.include?('/app/') && location_path.ends_with?("#{name.underscore}.rb") 
}.map{ |meth| 
  [
    meth, klass.method(meth).source_location
  ] 
}

=> [
  [
    :excluding_type,     
    [
      "/project_path/app/models/data/step_definitions/base.rb", 
      149
    ]
  ], 
  [
    :types,     
    [
      "/project_path/app/models/data/step_definitions/base.rb", 
      145
    ]
  ]
]

For discussion: mysql adapter

Hi,

I wonder what prevents Logidze from supporting MySQL. Are there technical reasons, or is it just a lack of time and no need for it right now?

I think that with MySQL support this gem could address many more people's problems.

Field added after the requested version is exempt from versioning

I start with some object

$ o = MyModel.create(a: 1, b: 2)

I add column c to MyModel in a migration

$ ap o
<MyModel: 0x00000> {
  :a => 1,
  :b => 2,
  :c => nil,
  :log_data => #<Logidze....>
}

I change the value of c

$ o.update(c: 3)

I request the initial version and expect to see the default value of c

$ o.at_version(1).c
3

Instead I see the most recent value of c.

Is this expectation reasonable? Is there a way to support this behavior if not by default then through configuration?

log_size should return 0 for nil

This is easy to step around, but right now if a record is created within a Logidze.without_logging block, the log_data will be nil. That's expected, but it would be nice if log_size returned 0 instead of this error:

Module::DelegationError: Document#log_size delegated to log_data.size, but log_data is nil

Add ignore_columns integration

From: #89 (comment)

The idea is the following: we want to reduce the data loaded from DB by not loading log_data column by default (using ignored_columns feature), and only load log_data when explicitly specified (e.g. via with_log_data scope).

The proposed API:

class User < ActiveRecord::Base
  has_logidze ignore_log_data: true
end

User.all #=> SELECT id, name FROM users

User.with_log_data #=> SELECT id, name, log_data FROM users

(optional, maybe, another PR) Add an easier way to access log_data on a single record, e.g.:

user = User.find(params[:id])
user.log_data #=> raises (or nil?) since column is ignored
user.log_data! #=> loads log_data from DB

Using Logidze with an unconventional project structure

I've installed logidze in a project without a conventional structure and I'm receiving an exception

class Project < Shared::ApplicationRecord
    has_logidze

undefined local variable or method `has_logidze' for Clients::Project(Table doesn't exist):Class

when I move has_logidze to a line after the table name is set in the model, I receive a slightly different exception

class Project < Shared::ApplicationRecord
    self.table_name = 'projects'
    has_logidze

undefined local variable or method `has_logidze' for #<Class:0x00007fbb160a04c8>
