shrinerb / shrine

File Attachment toolkit for Ruby applications

Home Page: https://shrinerb.com

License: MIT License

Topics: file-upload, ruby, rack, storage, attachment, filesystem, s3, background-jobs, direct-upload, orm, metadata

shrine's Introduction

Shrine logo: a red paperclip

Shrine is a toolkit for handling file attachments in Ruby applications.

If you're curious how it compares to other file attachment libraries, see the Advantages of Shrine. Otherwise, follow along with the Getting Started guide.

Links

  • Website & Documentation: shrinerb.com
  • Demo code: Roda / Rails
  • Wiki: github.com/shrinerb/shrine/wiki
  • Discussion forum: github.com/shrinerb/shrine/discussions
  • Alternate discussion forum: discourse.shrinerb.com

Setup

Run:

bundle add shrine

Then create config/initializers/shrine.rb, which sets up the storage and loads the ORM integration:

require "shrine"
require "shrine/storage/file_system"

Shrine.storages = {
  cache: Shrine::Storage::FileSystem.new("public", prefix: "uploads/cache"), # temporary
  store: Shrine::Storage::FileSystem.new("public", prefix: "uploads"),       # permanent
}

Shrine.plugin :activerecord           # loads Active Record integration
Shrine.plugin :cached_attachment_data # enables retaining cached file across form redisplays
Shrine.plugin :restore_cached_data    # extracts metadata for assigned cached files

Next, add the <name>_data column to the table you want to attach files to. For an "image" attachment on a photos table this would be an image_data column:

$ rails generate migration add_image_data_to_photos image_data:text # or :jsonb

If you're using jsonb, consider adding a GIN index for fast key-value lookups within image_data.
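For example, a migration sketch (PostgreSQL only; the index name and migration version are illustrative):

class AddGinIndexToPhotosImageData < ActiveRecord::Migration[5.2]
  def change
    # a GIN index speeds up key/value lookups inside the image_data jsonb column
    add_index :photos, :image_data, using: :gin, name: "index_photos_on_image_data"
  end
end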

Now create an uploader class (which you can put in app/uploaders) and register the attachment on your model:

class ImageUploader < Shrine
  # plugins and uploading logic
end
class Photo < ActiveRecord::Base
  include ImageUploader::Attachment(:image) # adds an `image` virtual attribute
end

In the view, add form fields for the attachment attribute so users can upload files:

<%= form_for @photo do |f| %>
  <%= f.hidden_field :image, value: @photo.cached_image_data, id: nil %>
  <%= f.file_field :image %>
  <%= f.submit %>
<% end %>

When the form is submitted, your controller can assign the file from the request params to the attachment attribute on the model:

class PhotosController < ApplicationController
  def create
    Photo.create(photo_params) # attaches the uploaded file
    # ...
  end

  private

  def photo_params
    params.require(:photo).permit(:image)
  end
end

Once a file is uploaded and attached to the record, you can retrieve the file URL and display it on the page:

<%= image_tag @photo.image_url %>

See the Getting Started guide for further documentation.

Inspiration

Shrine was heavily inspired by Refile and Roda. From Refile it borrows the idea of "backends" (here named "storages"), attachment interface, and direct uploads. From Roda it borrows the implementation of an extensible plugin system.

Similar libraries

  • Paperclip
  • CarrierWave
  • Dragonfly
  • Refile
  • Active Storage

Contributing

Please refer to the contributing page.

Code of Conduct

Everyone interacting in the Shrine project’s codebases, issue trackers, and mailing lists is expected to follow the Shrine code of conduct.

License

The gem is available as open source under the terms of the MIT License.

shrine's Issues

Remote url plugin can't find some files

Hi,
I found that while this URL can be opened fine, the remote_url plugin will not grab it.

https://trello-attachments.s3.amazonaws.com/551edb81eda0610c6fd2d322/718x485/0ac204696b33b9eba1744aaa938ded42/large_audience_reach_twitter_%5B1%5D.png

Mongoid Support?

Hey,

Does Shrine support Mongoid as an ORM?

If not, would you consider a PR?

Can't create multiple versions of image

I am trying to create two new versions of the uploaded image, one large and one small.

What I found is that the large version always ends up the same size as the thumb.

require "image_processing/mini_magick"

class AttachmentUploader < Shrine
  include ImageProcessing::MiniMagick

  plugin :activerecord
  plugin :logging, logger: Rails.logger

  plugin :determine_mime_type
  plugin :store_dimensions
  plugin :direct_upload, max_size: 20*1024*1024, presign: -> (r) do
    r.params["fields"] || {}
  end
  plugin :remove_attachment
  plugin :versions, names: [:original, :large, :thumb]

  def process(io, context)
    if context[:phase] == :store
      file = io.download
      large = resize_to_limit!(file, 1200, 1200)
      thumb = resize_to_limit!(file, 300, 300)

      {
        original: io, 
        large: large,
        thumb: thumb,
      }
    end
  end
end

The data behind this:

pp JSON.parse(Upload.last.attachment_data)
  Upload Load (0.5ms)  SELECT  "uploads".* FROM "uploads"  ORDER BY "uploads"."id" DESC LIMIT 1
{"original"=>
  {"id"=>"3a0768ce990db7637c54126d634a3e2ebee35ee6d3546bf3fba599f46939.jpg",
   "storage"=>"store",
   "metadata"=>
    {"filename"=>"Interstellar UHD 2.jpg",
     "size"=>1691252,
     "mime_type"=>"image/jpeg",
     "width"=>3840,
     "height"=>2160}},
 "large"=>
  {"id"=>"21c487c57e193eae3650025b453b0669ee083603621af0879823f454025c.jpg",
   "storage"=>"store",
   "metadata"=>
    {"filename"=>"Interstellar UHD 2.jpg",
     "size"=>14306,
     "mime_type"=>"image/jpeg",
     "width"=>300,
     "height"=>169}},
 "thumb"=>
  {"id"=>"7bd60d5ac34e0cc8164067571c36914588135bc87673f9f9b68660277ee7.jpg",
   "storage"=>"store",
   "metadata"=>
    {"filename"=>"Interstellar UHD 2.jpg",
     "size"=>14306,
     "mime_type"=>"image/jpeg",
     "width"=>300,
     "height"=>169}}}

_enforce_io doesn't work with method_missing

     NameError:
       undefined method `read' for class `Rack::Test::UploadedFile'
     # /home/_/.bundle/ruby/2.3.0/shrine-9c7ff26556fa/lib/shrine.rb:364:in `method'
     # /home/_/.bundle/ruby/2.3.0/shrine-9c7ff26556fa/lib/shrine.rb:364:in `block in _enforce_io'

Rack::Test::UploadedFile source:
https://github.com/brynary/rack-test/blob/master/lib/rack/test/uploaded_file.rb

      def method_missing(method_name, *args, &block) #:nodoc:
        @tempfile.__send__(method_name, *args, &block)
      end

      def respond_to?(method_name, include_private = false) #:nodoc:
        @tempfile.respond_to?(method_name, include_private) || super
      end

Solution:

Do not check method parameters.
Remove this check: && [a.count, -1].include?(io.method(m).arity)
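A minimal sketch of what the relaxed check could look like, verifying only that the object responds to the required methods (the method names come from the error messages above; this is illustrative, not Shrine's actual implementation):

# hypothetical helper: accept anything that responds to the required IO
# methods, without inspecting arity (arity checks break objects that delegate
# through method_missing, like Rack::Test::UploadedFile)
REQUIRED_IO_METHODS = [:read, :eof?, :rewind, :close]

def io_like?(object)
  REQUIRED_IO_METHODS.all? { |name| object.respond_to?(name) }
end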

Background helpers with ActiveJob

When trying to use the background_helpers plugin with ActiveJob, I got an error saying "Unsupported argument type: Symbol". Through inspection I found that Shrine passes the name of the attachment as a symbol to the promote block, and ActiveJob does not like that.

I ended up doing this:

Shrine::Attacher.promote do |data| 
  data["attachment"] = data["attachment"].to_s
  UploadJob.perform_later(data) 
end
class UploadJob < ActiveJob::Base
  queue_as :default

  def perform(data)
    data["attachment"] = data["attachment"].to_sym
    Shrine::Attacher.promote(data)
  end
end

Rails 3

I used it on Rails 4 and it worked very well, but on Rails 3 when I upload a file it shows:

<ActionDispatch::Http::UploadedFile:0x007fd93b01baf0 @original_filename="dededee.png", @content_type="image/png", @headers="Content-Disposition: form-data; name="user[avatar]"; filename="dededee.png"\r\nContent-Type: image/png\r\n", @tempfile=#Tempfile:/var/folders/4p/79mlwbyd6830qsq2x30wtspm0000gn/T/RackMultipart20151105-5296-1bybw4o> is not a valid IO object (it doesn't respond to eof?(), close())

So, is it not compatible with Rails 3?

How does Shrine work with non-image files?

Hi, I haven't found much info on non-image files.

I am going to use pre-signed S3 URLs, so there is no way I can tell upfront whether the file is an image or not, and I want all kinds of files uploaded.

Hopefully Shrine will work in my case; is there anything to keep in mind for this use case?

plugin method: use const_defined?

defined? is undesirable, because defined? can reference toplevel constants.
Use plugin.const_defined?(___, false)

You use the defined? method in the plugin method:

        def plugin(plugin, *args, &block)
          plugin = Plugins.load_plugin(plugin) if plugin.is_a?(Symbol)
          plugin.load_dependencies(self, *args, &block) if plugin.respond_to?(:load_dependencies)
          include(plugin::InstanceMethods) if defined?(plugin::InstanceMethods)
          extend(plugin::ClassMethods) if defined?(plugin::ClassMethods)
          self::UploadedFile.include(plugin::FileMethods) if defined?(plugin::FileMethods)
          self::UploadedFile.extend(plugin::FileClassMethods) if defined?(plugin::FileClassMethods)
          self::Attachment.include(plugin::AttachmentMethods) if defined?(plugin::AttachmentMethods)
          self::Attachment.extend(plugin::AttachmentClassMethods) if defined?(plugin::AttachmentClassMethods)
          self::Attacher.include(plugin::AttacherMethods) if defined?(plugin::AttacherMethods)
          self::Attacher.extend(plugin::AttacherClassMethods) if defined?(plugin::AttacherClassMethods)
          plugin.configure(self, *args, &block) if plugin.respond_to?(:configure)
          nil
        end
[1] pry(main)> defined?(Hash::String)
"constant"
[2] pry(main)> Hash.const_defined?(:String, false)
false
[4] pry(main)> Hash::String
(pry):4: warning: toplevel constant String referenced by Hash::String
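A sketch of the suggested change, swapping each defined? check for const_defined? with inherit set to false:

def plugin(plugin, *args, &block)
  plugin = Plugins.load_plugin(plugin) if plugin.is_a?(Symbol)
  plugin.load_dependencies(self, *args, &block) if plugin.respond_to?(:load_dependencies)
  # const_defined? with inherit = false only matches constants defined directly
  # on the plugin module, so toplevel constants can no longer leak through
  include(plugin::InstanceMethods) if plugin.const_defined?(:InstanceMethods, false)
  extend(plugin::ClassMethods) if plugin.const_defined?(:ClassMethods, false)
  self::UploadedFile.include(plugin::FileMethods) if plugin.const_defined?(:FileMethods, false)
  self::UploadedFile.extend(plugin::FileClassMethods) if plugin.const_defined?(:FileClassMethods, false)
  self::Attachment.include(plugin::AttachmentMethods) if plugin.const_defined?(:AttachmentMethods, false)
  self::Attachment.extend(plugin::AttachmentClassMethods) if plugin.const_defined?(:AttachmentClassMethods, false)
  self::Attacher.include(plugin::AttacherMethods) if plugin.const_defined?(:AttacherMethods, false)
  self::Attacher.extend(plugin::AttacherClassMethods) if plugin.const_defined?(:AttacherClassMethods, false)
  plugin.configure(self, *args, &block) if plugin.respond_to?(:configure)
  nil
end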

PDF to image conversion timeout

I've managed to get PDF conversion of the first page to an image to work with the following in process.

require "image_processing/mini_magick"

class ImageUploader < Shrine
  MAX_IMAGE_SIZE_MB = 50

  include ImageProcessing::MiniMagick

  plugin :determine_mime_type
  plugin :remove_attachment
  plugin :store_dimensions
  plugin :validation_helpers
  plugin :versions, names: [:original, :thumb, :large]
  plugin(:default_url) { |_| '/img/preview-not-available.jpg' }

  Attacher.validate do
    validate_max_size MAX_IMAGE_SIZE_MB.megabytes, message: "is too large (max is #{MAX_IMAGE_SIZE_MB} MB)"
    validate_mime_type_inclusion ['image/jpeg', 'image/png', 'image/gif', 'application/pdf']
  end

  def process(io, context)
    case context[:phase]
      when :store
        if io.mime_type == 'application/pdf'
          # NOTE: `convert!` calls `format` which defaults to copying page 0 only
          png_file = convert!(io.download, "png")
          thumb = resize_to_limit(png_file, 200, 200)
        else
          thumb = resize_to_limit!(large, 200, 200)
        end

        {original: io, thumb: thumb}
    end
  end
end

However, for a relatively large PDF (600+ pages), this code times out in Rails after about 1 minute. Running ImageMagick manually on the command line works in about 1-2 seconds.

$ convert "emacs.pdf[0]" emacs.png

Am I doing something wrong?

Related to #52 in that it's the same scenario.

PDF: https://www.gnu.org/software/emacs/manual/pdf/emacs.pdf

Is the `parsed_json?` string-keys check necessary?

I'm using Shrine with a framework that automatically parses JSON query params, so the parsed_json plugin looked like a good fit. However, since the framework presents the parsed JSON with symbol keys, the check requiring String keys fails.

Is this check really necessary, or is there a particular edge case that it's trying to avoid?

Another option — Looking a bit deeper in the call path for assign: assign_cached calls uploaded_file, which accepts Hash arguments. Is the Hash -> JSON String -> Hash round trip that parsed_json offers worth it, or can AttacherMethods#assign just handle Hashes with the same code path used for Strings?

File URL with Filesystem storage misses a starting /

Hi !

First of all, thanks for this gem; it looks very promising and I really prefer your approach compared to others I've used.

I am using the FileSystem storage for now, and I find it disturbing that calling url on a file outputs something like uploads/image/42/content/57940ab85648f2c416bfabc0e478ebe75233690d67abbac93b86db11f8bf.jpg

This is not a valid URL, as it is relative, and I need to add a / in front of it in all my views (or override url in all uploaders). I could specify a host, but I would find it much more consistent to have Shrine prefix it with /, especially with the subfolder parameter.

What do you think about it?

Getting error ERR_CONNECTION_REFUSED on direct upload in rails

I cloned the rails example app https://github.com/janko-m/shrine-rails-example/tree/improvements

I have done everything as per the readme, but when uploading a file I get this error in the browser console.

Cross-Origin Request Blocked: The Same Origin Policy disallows reading
the remote resource at
https://example-bucket.s3.ap-southeast.amazonaws.com/. (Reason: CORS
request failed).

Here are my CORS settings:

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <CORSRule>
        <AllowedOrigin>http://localhost:3000</AllowedOrigin>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedMethod>POST</AllowedMethod>
        <AllowedMethod>PUT</AllowedMethod>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
</CORSConfiguration>

Could someone tell me what's wrong here?

Status of chunking and resuming uploads

I've been searching through the documentation, but I didn't find anything about chunking and resuming of uploads. I'm asking because I plan to use Shrine with Fine Uploader. There is pretty nice documentation on what the backend should be capable of here: http://docs.fineuploader.com/endpoint_handlers/traditional.html

So my question is: what should I do with the UUID from Fine Uploader? Does Shrine support chunking? How about resuming uploads?

Thank you in advance for your answer :-)

cached_attachment_data is funky when validations fail

There are two scenarios:

  1. Choose a file that passes validations but the record itself has other validation errors. The file selected is stuck, can't upload a different file (upon submit the previous file is still assigned)

  2. Add a file that fails validations. Same result. Can't even change it for a valid file.

If I remove the cached_logo_data attribute from the strong parameters filter, then the file is no longer stuck (one can change it), but if the record has validation errors, upon fixing those and resubmitting (without changing the already valid file), the file is then not sent.

I'm banging my head trying some conditionals while manually setting the cached_attachment_attribute but can't figure out what causes this yet :(

Error after updating model with avatar record.

PS1: Sorry for my poor English
PS2: The image is sent to cache before submit (direct: true with JavaScript code).

View:

<%= f.hidden_field :avatar, value: @user.cached_avatar_data %>
<%= f.input :avatar, as: :file, label: false, input_html: { direct: true }, wrapper: false %>

Logger

Started PUT "/users/profile_photo" for 172.17.0.8 at 2016-06-29 21:42:22 +0000
  Auth Load (7.6ms)  SELECT  "users".* FROM "users" WHERE "users"."id" = $1  ORDER BY "users"."id" ASC LIMIT 1  [["id", 44]]
Processing by UsersController#update as HTML
  Parameters: {"utf8"=>"✓", "authenticity_token"=>"ArHlpzPOHOhR/4rm1vG319HDSFjkAc+nhUYj7a5nJSGxXAYLlH8T2Meeg1XaHiGqUOcC8XueJt+YyaNiiHNFqA==", "user"=>{"avatar"=>"{\"id\":\"5920c9aacb.jpg\",\"storage\":\"cache\",\"metadata\":{\"filename\":\"file.jpg\",\"size\":36713,\"mime_type\":\"image/jpeg\",\"width\":640,\"height\":480}}"}, "commit"=>"Next", "id"=>"profile_photo"}
   (3.4ms)  BEGIN
  SQL (8.5ms)  UPDATE "users" SET "avatar_data" = $1, "steps" = $2, "updated_at" = $3 WHERE "users"."id" = $4  [["avatar_data", nil], ["steps", 3], ["updated_at", "2016-06-29 21:42:25.136930"], ["id", 23]]
   (4.9ms)  ROLLBACK
Completed 500 Internal Server Error in 1473ms (ActiveRecord: 32.0ms)

ActiveRecord::StatementInvalid (PG::NotNullViolation: ERROR:  null value in column "avatar_data" violates not-null constraint
DETAIL:  Failing row contains (23, 44, Fulano, da Silva, 1991-08-05, 1, cachinxola, 00777022397, 0, 0, USD, null, null, 1, 3, 0, 2016-06-29 13:55:56.291964, 2016-06-29 21:42:25.13693).
: UPDATE "users" SET "avatar_data" = $1, "steps" = $2, "updated_at" = $3 WHERE "users"."id" = $4):
  app/controllers/user_controller.rb:32:in `update'

Model:

include PictureUploader[:avatar]

PictureUploader

class PictureUploader < Shrine
  include ImageProcessing::MiniMagick
  plugin :versions, names: [:original, :large, :medium, :small, :thumbnail]

  plugin :activerecord
  plugin :determine_mime_type
  plugin :logging, logger: Rails.logger
  plugin :remove_attachment
  plugin :store_dimensions
  plugin :validation_helpers
  plugin :direct_upload, max_size: 10*1024*1024 # 10 MB
  plugin :parsed_json
  plugin :direct_upload, presign: true
  plugin :pretty_location
  plugin :cached_attachment_data
  plugin :rack_file

  plugin :default_url do |context|
    "/fallback/album.png"
  end

   Attacher.validate do
    validate_max_size 10.megabytes, message: 'is too large (max is 10 MB)'
    validate_mime_type_inclusion ['image/jpg', 'image/jpeg', 'image/png', 'image/gif']
   end

   def process(io, context)
    case context[:phase]
    when :store
      large      = resize_to_limit!(io.download, 800, 600)
      medium     = resize_to_limit!(io.download, 500, 500)
      small      = resize_to_limit!(io.download, 100, 100)
      thumbnail  = resize_to_limit!(io.download, 50, 50)
      { original: io, large: large, medium: medium, small: small, thumbnail: thumbnail }
    end
  end
end

Migration

t.json :avatar_data, null: false, default: '{}'

Controller

current_user.update_attributes(profile_photo_params)

Strong Parameters

def profile_photo_params
  params.require(:user).permit(avatar: [:id, :storage, metadata: [:filename, :size, :mime_type, :width, :height]])
  end

Issue when trying to upload cached s3 file

I'm trying to do @user.avatar = params[:user][:avatar] on an avatar file that was uploaded via jquery.fileupload. The params look like this:
{"user"=>{"avatar"=>{"id"=>"f52c607d73a78858532eefdd303ffa8685d5771edd821a9e256dbd88322a", "storage"=>"cache", "metadata"=>{"size"=>"97874", "filename"=>"skylanders_swap_force_ps4-01.jpg", "mime_type"=>"image/jpeg"}}}}

The problem is that the gem can't pick this up as an io:
Shrine::InvalidFile: {"id"=>"f52c607d73a78858532eefdd303ffa8685d5771edd821a9e256dbd88322a", "storage"=>"cache", "metadata"=>{"size"=>"97874", "filename"=>"skylanders_swap_force_ps4-01.jpg", "mime_type"=>"image/jpeg"}} is not a valid IO object (it doesn't respond to read(length, outbuf), eof?(), rewind(), close()) from /Users/razvanciocanel/.rvm/gems/ruby-2.1.5@rails4/bundler/gems/shrine-39c045174f49/lib/shrine.rb:357:in `_enforce_io'

This worked with version 0.9.0, but now it does not with version 1.0.0.

What changed?


Other details:

shrine.rb file:

require "shrine"
require "shrine/storage/file_system"
require "shrine/storage/s3"
require "shrine/plugins/activerecord"

s3_options = {
  access_key_id:     ENV['AWS_ACCESS_KEY_ID'],      # "xyz"
  secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],  # "abc"
  region:            ENV['AWS_REGION'],             # "eu-west-1"
  bucket:            ENV['AWS_BUCKET'],             # "my-app"
}

if Rails.env.development?
  s3_options.merge!({
    endpoint:           ENV["AWS_ENDPOINT"],
    host:               "#{ENV["AWS_ENDPOINT"]}/#{ENV["AWS_BUCKET"]}/",
    force_path_style:   true
  })
end

Shrine.storages = {
  cache: Shrine::Storage::S3.new(prefix: "cache", **s3_options),
  store: Shrine::Storage::S3.new(prefix: "store", **s3_options),
}

Shrine.plugin :direct_upload, presign: true, max_size: nil #20*1024*1024
Shrine.plugin :background_helpers

Shrine::Attacher.promote { |data| ShrineUploadJob.perform_async(data) }
Shrine::Attacher.delete { |data| ShrineDeleteJob.perform_async(data) }

Empty files after version processing

Hi.

Basically, I end up with empty files after processing the file and returning versions as in the example. I think the issue is that the files are not rewinded after being passed to MiniMagick, since it works if I rewind the file cursor or pass a new tempfile to one of the MiniMagick processing methods. I guess adding a rewind call for the versioned files when storing should do the trick?
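A minimal sketch of that workaround (process_with_minimagick stands in for whatever MiniMagick-based processing is being done; it is not a real Shrine or MiniMagick method):

def process(io, context)
  if context[:phase] == :store
    thumb = process_with_minimagick(io.download) # hypothetical processing helper
    thumb.rewind                                 # read the processed file from the beginning when storing
    { original: io, thumb: thumb }
  end
end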

Rubinius does not like "multipart: {file: image}" in tests.

Rubinius does not like "multipart: {file: image}" in shrine's tests.

We are getting the following error 9+1 (10) times in one test file, direct_upload_test.rb, on Travis:

To reproduce it, I changed my rvm locally with "rvm install rbx-2.9", ran "bundle"/"rake", and got the same results.

You can also run this: "rake test TEST=test/plugin/direct_upload_test.rb".

Question: Is there another way to specify the "multipart: {file: image}" in this file: test/plugin/direct_upload_test.rb"?

Here is the line in the rack-test_app gem: lib/rack/test_app.rb:230: * v.is_a?(::File) ? mp.add_file(k, v) : mp.add(k, v.to_s)

sending data to model instances

hey Janko-m,

Sorry to bother you again. I have another question about Shrine.
The code that you graciously gave to me is working very well.
I'm confused about how the included plugin works. When I save an instance of my model (in this case Track) through its controller's #create action, is there a way for Shrine to know that the uploaded file is associated with the Track instance being saved?

here is the code again

class AudioUploader < Shrine
  MAX_FILESIZE = 50.megabytes

  MIME_TYPES = %w(audio/mpeg audio/mp3 audio/ogg audio/x-aiff audio/flac  application/octetstream).freeze # validate type
  IS_LOCAL = Rails.env.test? || Rails.env.development? && ENV.key?('LOCAL_UPLOADS')
  plugin :direct_upload, max_size: MAX_FILESIZE, presign: !IS_LOCAL
  plugin :remote_url,    max_size: MAX_FILESIZE
  plugin :hooks

  plugin :included do |name|
    before_save do
      if send("#{name}_data_changed?") && send(name) && send(name).storage_key == "cache"
        self.waveform = send(name).metadata["peaks"]
      end
    end
  end

  def around_upload(io, context)
    @super_audio = super
    if context[:phase] == :cache
      if io.respond_to?(:tempfile)

        audio = FFMPEG::Movie.new(io.tempfile.path)
        wav = Tempfile.new(['forwaveform', '.wav'])
        audio.transcode(wav.path)
        length = 60
        info = WaveFile::Reader.info(wav.path)
        sample_size = info.sample_frame_count / length
        peaks = []
        WaveFile::Reader.new(wav.path, WaveFile::Format.new(:mono, :float, 44_100)) do |reader|

          reader.each_buffer(sample_size) do |buffer|
            intermediary = []
            steps = buffer.samples.length / 10
            (0..9).each do |step|
              intermediary.push(buffer.samples[step * steps].round(2))
            end

            peaks.push(intermediary.max)
            peaks.push(intermediary.min)
          end

          @super_audio.metadata.update("peaks" => peaks)
        end
      end
    end
  end
end

The upload happens first, and then afterward a form is filled out and the track is created.

thank you so much
please let me know if you need more information

Metadata extraction

Hi,
I am following your documentation to extract more metadata (EXIF), but I am seeing strange behavior.
In the extract_metadata method, the io object does not have a path method, nor a valid tempfile object.
Using the following code, I get an undefined method 'tempfile' for #<ImageUploader::UploadedFile error.
Any advice or ideas?

class ImageUploader

...

  def extract_metadata(io, context)
    metadata = super
    metadata['exif'] = Exif::Data.new(io.tempfile.path)
    metadata
  end

...

end

using Rails 4.2.5, ActiveRecord and the following plugins:

  • Shrine.plugin :activerecord
  • Shrine.plugin :logging, logger: Rails.logger
  • Shrine.plugin :direct_upload, presign: true
  • Shrine.plugin :determine_mime_type
  • plugin :remove_attachment
  • plugin :store_dimensions
  • plugin :validation_helpers
  • plugin :pretty_location
  • plugin :versions, names: [:original]

Thank you

Moving files on S3 from cache to store

I have my application setup and working very nicely with direct to s3 uploads using shrine.

Looking into the delete_uploaded plugin, I notice that it does not support UploadedFile and therefore will not operate on S3 cache to S3 store.

I notice the move_to method on aws-sdk: http://docs.aws.amazon.com/sdkforruby/api/Aws/S3/Object.html#move_to-instance_method

Have you evaluated this at all? I would think that we can get the functionality of delete_upload on s3 to s3 copying if we use this move_to method.

But also, I notice move_to just calls copy_to and then delete, leading me to think that all the copy operations are synchronous and therefore OK to use?

Have you evaluated this at all? If it is possible, do you think this should be part of the delete_uploaded plugin or a separate plugin?

S3 upload of videos (mp4) from an uploader to 'store' storage expects content_md5 to be set in options hash with md5 at Shrine::Storage::S3#put

Hi, I am using shrine to upload videos to S3 from a Rails app.

My mongoid model for storing video looks like this:

class Video
  include Mongoid::Document
  include Mongoid::Timestamps
  include VideoUploader[:file]

  field :file_data, type: String
end

And VideoUploader for the same looks like this:

class VideoUploader < Shrine
  plugin :determine_mime_type
  plugin :remove_attachment
  plugin :store_dimensions
  plugin :validation_helpers
  plugin :versions, names: [:default]
  plugin :moving, storages: [:cache]

  def process(io, context)
    case context[:phase]
    when :store
      original  = io.download

      { default: original }
    end
  end
end

From a Rails form I directly upload video to S3 cache storage using jquery-fileupload

And I am using a standard Sidekiq worker with Shrine::Attacher.promote(data) for moving video from cache to store storage.

When this background job runs, I get this stacktrace:

2016-04-22T11:31:46.853Z 22691 TID-12o5n0 WARN: Aws::S3::Errors::BadDigest: The Content-MD5 you specified did not match what we received.
2016-04-22T11:31:46.854Z 22691 TID-12o5n0 WARN: /home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/aws-sdk-core-2.1.18/lib/seahorse/client/plugins/raise_response_errors.rb:15:in `call'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/aws-sdk-core-2.1.18/lib/aws-sdk-core/plugins/s3_sse_cpk.rb:18:in `call'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/aws-sdk-core-2.1.18/lib/aws-sdk-core/plugins/param_converter.rb:21:in `call'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/aws-sdk-core-2.1.18/lib/seahorse/client/plugins/response_target.rb:21:in `call'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/aws-sdk-core-2.1.18/lib/seahorse/client/request.rb:70:in `send_request'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/aws-sdk-core-2.1.18/lib/seahorse/client/base.rb:207:in `block (2 levels) in define_operation_methods'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/aws-sdk-resources-2.1.18/lib/aws-sdk-resources/request.rb:24:in `call'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/aws-sdk-resources-2.1.18/lib/aws-sdk-resources/operations.rb:41:in `call'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/aws-sdk-resources-2.1.18/lib/aws-sdk-resources/operation_methods.rb:19:in `block in add_operation'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/storage/s3.rb:241:in `put'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/storage/s3.rb:128:in `upload'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine.rb:337:in `copy'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/plugins/moving.rb:45:in `copy'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine.rb:332:in `put'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine.rb:316:in `_store'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/plugins/versions.rb:148:in `_store'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/plugins/versions.rb:145:in `block in _store'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/plugins/versions.rb:144:in `each'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/plugins/versions.rb:144:in `inject'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/plugins/versions.rb:144:in `_store'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine.rb:243:in `store'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/plugins/logging.rb:82:in `block in store'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/plugins/logging.rb:97:in `block in log'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/plugins/logging.rb:155:in `block in benchmark'
/home/dev/.rvm/rubies/ruby-2.2.2/lib/ruby/2.2.0/benchmark.rb:303:in `realtime'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/plugins/logging.rb:155:in `benchmark'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/plugins/logging.rb:97:in `log'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/plugins/logging.rb:82:in `store'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine.rb:214:in `upload'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine.rb:606:in `store!'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine.rb:529:in `promote'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/bundler/gems/shrine-8a0623979c6e/lib/shrine/plugins/backgrounding.rb:70:in `promote'
/home/dev/apps/staging/releases/20160422111751/app/workers/shrine/upload_job.rb:9:in `perform'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/processor.rb:75:in `execute_job'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/processor.rb:52:in `block (2 levels) in process'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/middleware/chain.rb:127:in `block in invoke'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/middleware/server/retry_jobs.rb:74:in `call'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/middleware/chain.rb:129:in `block in invoke'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/middleware/server/logging.rb:15:in `block in call'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/logging.rb:30:in `with_context'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/middleware/server/logging.rb:11:in `call'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/middleware/chain.rb:129:in `block in invoke'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/middleware/chain.rb:132:in `call'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/middleware/chain.rb:132:in `invoke'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/processor.rb:51:in `block in process'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/processor.rb:98:in `stats'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/sidekiq-3.4.1/lib/sidekiq/processor.rb:50:in `process'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/celluloid-0.16.0/lib/celluloid/calls.rb:26:in `public_send'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/celluloid-0.16.0/lib/celluloid/calls.rb:26:in `dispatch'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/celluloid-0.16.0/lib/celluloid/calls.rb:122:in `dispatch'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/celluloid-0.16.0/lib/celluloid/cell.rb:60:in `block in invoke'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/celluloid-0.16.0/lib/celluloid/cell.rb:71:in `block in task'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/celluloid-0.16.0/lib/celluloid/actor.rb:357:in `block in task'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/celluloid-0.16.0/lib/celluloid/tasks.rb:57:in `block in initialize'
/home/dev/apps/staging/shared/bundle/ruby/2.2.0/gems/celluloid-0.16.0/lib/celluloid/tasks/task_fiber.rb:15:in `block in create'

Looking through the stacktrace and reaching Shrine::Storage::S3#put, I noticed that Shrine does not send content_md5 with the MD5 hash of the file in the options hash. S3 does not normally seem to require it, but only in the case of video uploads does it seem to throw an exception when it's missing.

I could fix this by calculating the MD5 hash of the video being uploaded and adding it to the options hash in Shrine::Storage::S3#put, referencing the aws-sdk docs for Aws::S3::Object#put, like so:

def put(io, id, **options)
  options.merge!(content_md5: Digest::MD5.base64digest(File.read(io))) if 
    options[:content_type] == 'video/mp4'
  object(id).put(body: io, **options)
end

Let's add Travis CI

Good day! Thanks for your work!

Let's add Travis CI to Shrine's repositories?

Certain characters break

Hello again @janko-m, I found a quick bug with processing:

If direct-to-S3 uploads contain the special ${filename} part in the key, the Down library does not properly escape the resulting URL.

The following error occurs when processing:

Down::NotFound Message: bad URI(is not URI?): https://.../uploads/b969e2dba64b40c3884075bf611a5950cde2f190da1762b38f8222e52693/IMG_20150915_200027[1].jpg?X-Amz-Algorithm=...

Error backtrace:

vendor/bundle/ruby/2.2.0/gems/down-1.1.0/lib/down.rb:47 in rescue in download
vendor/bundle/ruby/2.2.0/gems/down-1.1.0/lib/down.rb:14 in download
vendor/bundle/ruby/2.2.0/gems/shrine-1.2.0/lib/shrine/storage/s3.rb:130 in download
vendor/bundle/ruby/2.2.0/gems/shrine-1.2.0/lib/shrine.rb:742 in download
app/uploaders/attachment_uploader.rb:20 in process
vendor/bundle/ruby/2.2.0/gems/shrine-1.2.0/lib/shrine.rb:349 in processed

Now the url method produces the following url for the same resource, which works:

https://.../uploads/b969e2dba64b40c3884075bf611a5950cde2f190da1762b38f8222e52693/IMG_20150915_200027%5B1%5D.jpg?X-Amz-Algorithm=...

Notice that the [1] part of the filename is escaped as %5B1%5D in the url version, and not escaped in the Down version.

This page contains character escaping info: http://docs.aws.amazon.com/AmazonS3/latest/dev/UsingMetadata.html


Also, note that this happened because I generated the presigned_post myself and did not use the direct_upload plugin. Perhaps you may not want to support using the ${filename} syntax on the s3 key, which is fine in my opinion. I am fixing this in my application by removing the ${filename} part.

PDF upload corrupted

Using Rails 4 and the ActiveRecord Shrine plugin, there seems to be file corruption in PDF uploads.

At some point between being a Rack file coming into Rails and the process method being called, the copy made by Shrine into the cache is being corrupted consistently.

# Original
$ ll /tmp/*.pdf
-rw-------. 1 foo bar 2520603 Mar 17 16:28 /tmp/RackMultipart20160317-60267-1082pik.pdf

# Shrine's Cache
$ ll /project/public/uploads/cache/
... snip ...
-rw-r--r--. 1 501 foo 2520347 Mar 17 16:28 27f745231e058660d965d47bfe44fce7eb30a2c96e837227d4917567f1d1.pdf
... snip ...

Filesizes

Original: 2520603
Shrine: 2520347

I've had a bit of trouble following the stacktrace to see at what exact point the io param of process copies the original Rack file to see what could be causing this, so if you know where please let me know and I can keep hunting. Thanks.

File used: https://www.gnu.org/software/emacs/manual/pdf/emacs.pdf

File isn't stored

Using a basic config and attaching the file in Active Record. When I save the file, it's only stored in cache; it's never uploaded to store.

I added logging to #process and I can see that it's only called with the cache context:

  def process(io, context)
    Rails.logger.debug context
    nil
  end

  Photo.create(file: File.open("somefile"))
  # => {:name=>:file, :record=>#<Photo id: nil, file_data: nil, created_at: nil, updated_at: nil>, :phase=>:cache}

strip

First, great work; I'm looking to move from CarrierWave to Shrine.

Just one question: can I change the image quality to reduce the size of the image on upload?

An example with MiniMagick and CarrierWave's strip method:
module CarrierWave
  module MiniMagick
    # Strips out all embedded information from the image
    def strip
      manipulate! do |img|
        img.strip
        img = yield(img) if block_given?
        img
      end
    end

    # Reduces the quality of the image to the percentage given
    def quality(percentage)
      manipulate! do |img|
        img.quality(percentage.to_s)
        img = yield(img) if block_given?
        img
      end
    end
  end
end
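For comparison, a rough Shrine sketch doing the same thing inside process, operating on the downloaded file with MiniMagick directly (assumes the mini_magick gem; the 85% quality value is just an example):

require "mini_magick"

class ImageUploader < Shrine
  def process(io, context)
    if context[:phase] == :store
      file  = io.download
      image = MiniMagick::Image.new(file.path)
      image.combine_options do |cmd|
        cmd.strip         # drop embedded EXIF/profile data
        cmd.quality "85"  # re-encode at lower quality to reduce file size
      end
      File.open(file.path, "rb") # reopen so the stored file reflects the processed content
    end
  end
end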

Warning: Library not loaded: /usr/local/lib/libpng

Hi,
I just added thumbnail generation as the README suggests, but the result is not a hash from which I can fetch the thumb; it's still a plain upload.

Upload.last.attachment.class
  Upload Load (4.6ms)  SELECT  "uploads".* FROM "uploads"  ORDER BY "uploads"."id" DESC LIMIT 1
=> AttachmentUploader::UploadedFile

When I use pry to see what's happening there and run the resize method myself, I get this output in the console, but no exception, so it's easy to overlook:

dyld: Library not loaded: /usr/local/lib/libpng15.15.dylib
  Referenced from: /usr/local/lib/libfreetype.6.dylib
  Reason: image not found

Although I have libpng installed using brew, I'm not sure what the problem is, but maybe it would be nice to raise if there is an error like that?

NoMethodError: undefined method `find_record' for Shrine::Attacher:Class

The background attacher seems to be missing a method, called here:
https://github.com/janko-m/shrine/blob/47484f9d9c7833b0bed228f8c0d887b7d33757de/lib/shrine/plugins/background_helpers.rb#L80

require "shrine"
require "shrine/storage/file_system"
require "shrine/storage/s3"

s3_options = {
  access_key_id:     ENV['AWS_ACCESS_KEY_ID'],      # "xyz"
  secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],  # "abc"
  region:            ENV['AWS_REGION'],             # "eu-west-1"
  bucket:            ENV['AWS_BUCKET'],             # "my-app"
}

Shrine.storages = {
  cache: Shrine::Storage::S3.new(prefix: "cache", **s3_options),
  store: Shrine::Storage::S3.new(prefix: "store", **s3_options),
}

Shrine.plugin :direct_upload, presign: true, max_size: nil #20*1024*1024
Shrine.plugin :background_helpers

Shrine::Attacher.promote { |data| ShrineUploadJob.perform_async(data) }
Shrine::Attacher.delete { |data| ShrineDeleteJob.perform_async(data) }

class ShrineUploadJob
  include Sidekiq::Worker
  def perform(data)
    Shrine::Attacher.promote(data)
  end
end

class ShrineDeleteJob
  include Sidekiq::Worker
  def perform(data)
    Shrine::Attacher.delete(data)
  end
end

upload_options + S3 presigning

Looking at lib/shrine/storage/s3.rb:207, I'd expected options set using the upload_options plugin to be taken into consideration when generating a presigned request. However, lib/shrine/plugins/upload_options.rb:23 reveals that these options are only applied on put to the storage.

I was going to submit a PR making the two work together, but realized the block form of upload_options expects an IO object, which we won't have at presigning time… so I'm opening this to document the issue and see if you have any thoughts about ways forward.

Upload endpoint only uploading to 'cache' folder

Hey again!

I was able to resolve the previous issue (details posted there). Now, with the given upload endpoint /attachments/images/cache/upload, any uploaded file returns a 200 status in the inspector, but when I check my S3 bucket it is only uploaded to the cache folder and the store folder is empty.

Plus, since I'm trying to integrate the Fine Uploader library, it requires a specific key in the returned JSON response. Currently, I'm getting this response:

{"id":"4a4191c6c43f54c0a1eb2cf482fb3543.PNG","storage":"cache","metadata":{"filename":"IMG_0105.PNG","size":114333,"mime_type":"image/png","width":640,"height":1136}}

Whereas Fine Uploader expects the property {"success":true} in the response; how can I add this value to the response?

I'm not sure if these issues are related to the shrine gem or to AWS, since I'm receiving a different kind of 200 response with this example's upload method (which uses jquery-file-upload). However, if it is not related to this gem, please let me know.

regenerating versions doesn't work

Hi Janko,

first of all thanks a lot for the great Shrine, I really like it.

I tried to follow the guide on Reprocessing Versions. It doesn't work because Shrine::Plugins::Activerecord::AttacherMethods#swap is private (it shouldn't be). This is my Rake task:

Photo.all.each do |photo|
  attacher, attachment = photo.image_attacher, photo.image
  if attacher.stored?
    file_500 = resize_to_limit(attachment[:large].download, 500, 500)
    medium = attacher.store!(file_500, version: :medium)
    attacher.swap(attachment.merge(medium: medium))
  end
end
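As a stop-gap until swap is made public, the private method can be invoked with send (a workaround sketch, not a recommendation of the API):

attacher.send(:swap, attachment.merge(medium: medium))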

Sorry I can't provide a pull request right now.

Cheers
Kai

Cannot load such file -- down/version

Hi @janko-m

I'm trying your gem and get this error:

/Users/eduardo/.rbenv/versions/2.3.0/lib/ruby/gems/2.3.0/gems/activesupport-5.0.0.rc1/lib/active_support/dependencies.rb:293:in `require': cannot load such file -- down/version (LoadError)

I'm using Rails 5.0.0.rc1. Any help?

Can't presign with options

Hi, I am unable to use presigned request with additional options.

Using pry I opened up the code where it fails, and it's this line from the direct_upload plugin:

signature = @uploader.storage.presign(location, options || {})

Looking at what is there:

> [location, options]
["0042c4091bed9e492c5d54be2c92ffe38a4a6049f68e282eb3f0d0db5bfd/${filename}", {"success_action_redirect"=>"/api/uploads/callback?parent_id=1386&parent_type=Card"}]

This is the error that I get:

> signature = @uploader.storage.presign(location, options || {})
Wrong number of arguments (2 for 1)
===================================
[0] /vendor/gems/shrine-1.2.0/lib/shrine/storage/s3.rb:206:in `presign'
    201:       # Returns a signature for direct uploads. Internally it calls
    202:       # [`Aws::S3::Bucket#presigned_post`], and forwards any additional options
    203:       # to it.
    204:       #
    205:       # [`Aws::S3::Bucket#presigned_post`]: http://docs.aws.amazon.com/sdkforruby/api/Aws/S3/Bucket.html#presigned_post-instance_method
 => 206:       def presign(id, **options)
    207:         options = upload_options.merge(options)
    208:         object(id).presigned_post(options)
    209:       end
    210: 
    211:       protected
[21] /vendor/gems/shrine-1.2.0/lib/shrine/plugins/direct_upload.rb:195:in `block (3 levels) in <class:App>'
[25] /vendor/gems/shrine-1.2.0/lib/shrine/plugins/direct_upload.rb:191:in `block (2 levels) in <class:App>'
[28] /vendor/gems/shrine-1.2.0/lib/shrine/plugins/direct_upload.rb:180:in `block in <class:App>'

When the second argument is empty:

> [location, options]
["2376793e32b3fc1874ca9e578e9698725e555bdeb4792d39bb0fd73513f3/${filename}", {}]

it passes fine.

> signature = @uploader.storage.presign(location, options || {})
=> #<Aws::S3::PresignedPost:0x007fbbf0c35858
 @bucket_name="......",
 @bucket_region="eu-central-1",
 @conditions=[{"bucket"=>"....."}, ["starts-with", "$key", "cache/39dc821e56780a10c513a5e37a45c64b9f7c9945e0825d11ab96c44f5bff/"]],
 @credentials=#<Aws::Credentials access_key_id=".........">,
 @fields={"key"=>"cache/39dc821e56780a10c513a5e37a45c64b9f7c9945e0825d11ab96c44f5bff/${filename}"},
 @key_set=true,
 @signature_expiration=2016-02-08 11:31:01 +0100,
 @url="https://mozek.s3.eu-central-1.amazonaws.com">

I was looking into how keyword-argument methods work and couldn't find anything wrong with Shrine's code.

I checked the method's arity, and it's -2, which means 1 required argument and a variable number more. That makes sense when looking at the code, but not when looking at the behavior.

I also tried calling it with **options, but no luck:

> signature = @uploader.storage.presign(location, **(options || {}))
Wrong argument type string (expected symbol)
============================================
[20] /vendor/gems/shrine-1.2.0/lib/shrine/plugins/direct_upload.rb:195:in `block (3 levels) in <class:App>'
    190: 
    191:             r.get "presign" do
    192:               location = SecureRandom.hex(30) + request.params["extension"].to_s
    193:               options = presign.call(request) if presign.respond_to?(:call)
    194: 
 => 195:               binding.pry
    196:               signature = @uploader.storage.presign(location, **(options || {}))
    197: 
    198:               json Hash[url: signature.url, fields: signature.fields]
    199:             end if presign
    200:           end
[24] /vendor/gems/shrine-1.2.0/lib/shrine/plugins/direct_upload.rb:191:in `block (2 levels) in <class:App>'
[27] /vendor/gems/shrine-1.2.0/lib/shrine/plugins/direct_upload.rb:180:in `block in <class:App>'

I am using Ruby 2.1.5, but also tested with 2.2.1.

Any ideas?
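One workaround consistent with both errors above: the presign block returns a hash with string keys, and a method defined with a **options keyword splat only accepts symbol keys. Returning symbol keys from the presign block (a sketch, using the same redirect value as above) avoids both failures:

plugin :direct_upload, presign: -> (r) do
  # symbol keys, so the hash can be forwarded as keyword arguments to Storage::S3#presign
  { success_action_redirect: "/api/uploads/callback?parent_id=1386&parent_type=Card" }
end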

Default image from disk

Is it possible to define a default image in shrine?

The intended behaviour would be something like:

image_tag(user.avatar_url || default_user_avatar)

But without having to do this every time (or having to do this on the User model). It would also be helpful if it integrated with the versions plugin.
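For reference, a sketch using the default_url plugin (already used by other uploaders quoted on this page); the assumption here is that the block's context hash includes the attachment name:

class ImageUploader < Shrine
  plugin :default_url do |context|
    # returned when no file is attached, e.g. user.avatar_url => "/fallback/avatar.png"
    "/fallback/#{context[:name]}.png"
  end
end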

logger value isn't inherited

Shrine.plugin :logging, logger: Rails.logger

class SomeUploader < Shrine
end

Shrine.logger # inspected value is: <ActiveSupport::Logger: ...
SomeUploader.logger # inspected value is: <Logger: ...

Upload endpoint not working

I'm trying to integrate this gem with the Fine Uploader front-end library to enhance the user experience (http://fineuploader.com/). The Fine Uploader plugin requires us to specify the URL where it should send uploads and server requests. This endpoint can be an absolute path or a relative path. More information can be found here: http://docs.fineuploader.com/quickstart/02-setting_options.html

I've read the shrine gem docs section https://github.com/janko-m/shrine#direct-uploads, I've set up my endpoint in routes.rb as mount ImageUploader::UploadEndpoint, at: "/attachments/images", and I'm setting my endpoint in the Fine Uploader initialization function as:

    $("#fine-uploader-gallery").fineUploader({
        request: {
            endpoint: '/attachments/images/cache/upload'
        },
    });

But in the network tab of the Chrome inspector, I can see that the POST request to this URL returns a 404 status. What could be the issue? Do I need to pass some parameters along with this endpoint? If so, how do I receive those parameters in my Fine Uploader function?

PS: I'm using this repository as base: https://github.com/erikdahlstrand/shrine-rails-example and I want to use fineuploader instead of jQuery file upload library.

Breaking change on master

This used to work

    @upload = Upload.create!(resource_params)

With attributes passed being

{
 #...
 "attachment"=>
  {"id"=>
    "2162ef5ccc922cf071e81504c5cf08e60d135e7144887cb0565eb85ba021/Screen Shot 2016-01-25 at 15.16.39.png",
   "storage"=>"cache",
   "metadata"=>
    {"width"=>342,
     "height"=>348,
     "size"=>73681,
     "filename"=>"Screen Shot 2016-01-25 at 15.16.39.png",
     "mime_type"=>"image/png"}},
 }

Now I get

Shrine::Error: unknown version: "id"
from: /Users/michal/.rvm/gems/ruby-2.2.1/bundler/gems/shrine-01f20b64ef4a/lib/shrine/plugins/versions.rb, line 105, in `block in versions!'

If I go back to earlier revision 3ab14cb1ad71e1254b025bf4ae721c3fed52a16f the code works.

IRC? Gitter?

Hi Janko,
Any ideas surrounding creating a IRC channel (Freenode?) or Gitter to discuss stuff regarding Shrine?

Processing with backgrounding

I'm using the background jobs plugin to promote from cache to store in a background job. However, when combined with image processing, it simply fails, I think because the promote method never calls the process method.

If I do the resizing in the cache phase it works fine, and if I disable the background jobs plugin and do the processing in the store phase it also works fine.

Is this a bug, or am I just using it wrongly?

Unable to get url of s3 file

The following call record.file.url fails when metadata["id"] contains a space:

bad URI(is not URI?): invoice/2016/sample file.pdf
...
"/var/lib/gems/2.2.0/gems/shrine-1.3.0/lib/shrine/storage/s3.rb:194:in `url'"
...

It looks like the file is uploaded correctly, but I'm not able to get its URL.
Maybe the id should be URI-encoded before joining with the host.

How to proxy caching images [:store, :cache] using nginx with docker?

Here I run two containers: one for the application (with Puma) and another with nginx. When I used CarrierWave, I managed to get the images proxy-cached by nginx, but with Shrine I can't.

Could someone send me a suggested Dockerfile or docker-compose.yml for setting this up?
