Looking for an alternative to filters/observers for a Ruby on Rails project

Rails has a good set of filters (before_validation, before_create, after_save, etc.) as well as observer support, but I've run into a situation where relying on a filter or observer is too computationally expensive. I need an alternative.

Problem: I am logging web server requests for a large number of pages. I need a trigger that takes an action (like sending an email) when a given page has been viewed more than X times. Due to the sheer number of pages and hits, using a filter or observer would waste a lot of work, because 99% of the time the condition it checks will be false. The email doesn't need to be sent immediately (a 5-10 minute delay is acceptable).

Instead, I could use some kind of process that polls the database every 5 minutes or so, checks which pages have been hit more than X times, records that state in a new DB table, and then sends out the appropriate emails. It's not exactly elegant, but it would work.
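
Something like the following is what I have in mind (a rough sketch; Page, its hits_count and notified columns, and ThresholdMailer are placeholder names for my own models, not anything Rails provides):

# lib/tasks/page_alerts.rake -- hypothetical polling task, run from cron
namespace :page_alerts do
  task :check => :environment do
    threshold = 1000  # X, whatever the threshold is
    # Pages over the threshold that we haven't alerted on yet
    pages = Page.find(:all, :conditions => ["hits_count > ? AND notified = ?", threshold, false])
    pages.each do |page|
      ThresholdMailer.deliver_notice(page)    # send the alert email
      page.update_attribute(:notified, true)  # record state so we only email once
    end
  end
end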

Does anyone else have a better idea?



4 answers


Rake tasks are nice! But you will end up writing more custom code for each background job you add. Check out the Delayed Job plugin: http://blog.leetsoft.com/2008/2/17/delayed-job-dj

DJ is an asynchronous priority queue backed by a single, simple database table. According to the DJ site, you can create a job using the Delayed::Job.enqueue method, as shown below.



class NewsletterJob < Struct.new(:text, :emails)
  def perform
    emails.each { |e| NewsletterMailer.deliver_text_to_email(text, e) }
  end
end

Delayed::Job.enqueue(NewsletterJob.new("blah blah", Customers.find(:all).collect(&:email)))
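
Applied to the question, you could enqueue a job whenever a hit pushes a page over the threshold, so the email goes out asynchronously instead of blocking the request. A minimal sketch (Page and ThresholdMailer are assumed names from the question's setup, not part of DJ):

class ThresholdAlertJob < Struct.new(:page_id)
  def perform
    page = Page.find(page_id)
    ThresholdMailer.deliver_notice(page)  # hypothetical mailer
  end
end

# somewhere in the hit-tracking code:
Delayed::Job.enqueue(ThresholdAlertJob.new(page.id))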

      



I was once part of a team that wrote its own ad server with the same requirement: keep track of how many times a document is viewed, and do something once it reaches a certain threshold. The server was going to be hooked into an existing, very high-traffic site, so scalability was a major concern. My company hired two DoubleClick consultants so we could pick their brains.

Their advice was: the fastest way to store any piece of information is to write it to a custom Apache log. So we built the site so that every time someone hit a document (an ad, a page, whatever), the server handling the request wrote an SQL statement to a log file: "INSERT INTO impressions (timestamp, page, ip, ...) VALUES (x, 'path/to/doc', y, ...);", with everything filled in dynamically from the web server's request data. Every 5 minutes we collected these files from the web servers and piped them, one at a time, into the master database. Then, at our leisure, we could parse that data and do whatever we liked with it.
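
A setup along those lines might look roughly like this in the Apache config (a sketch, not what we actually ran; the log path and column names are invented, and a real version would have to SQL-escape the path):

# httpd.conf -- emit one ready-to-run INSERT statement per request
LogFormat "INSERT INTO impressions (logged_at, path, ip) VALUES ('%{%Y-%m-%d %H:%M:%S}t', '%U', '%a');" sqllog
CustomLog logs/impressions.log sqllog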



Depending on your exact requirements and deployment setup, you could do something similar. The computation needed to check whether you've passed a certain threshold is probably (guessing here) even smaller than the cost of running an SQL statement to increment a value or insert a row on every hit. Either way, you can get rid of both bits of per-request overhead by logging hits (in a special format or not) and then periodically collecting the logs, parsing them, inserting the data into the database, and doing whatever you like with it.
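
The periodic collector can then be almost trivial: pick up the rotated log and replay each line against the database. A rough sketch (the path is invented, and error handling and log rotation are glossed over):

# script/import_impressions.rb -- hypothetical batch importer
# run via: script/runner script/import_impressions.rb
log_path = "/var/log/apache2/impressions.log.1"  # the rotated file, not the live one

File.foreach(log_path) do |line|
  line.strip!
  next if line.empty?
  # each line is already a complete INSERT statement written by Apache
  ActiveRecord::Base.connection.execute(line)
end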



When saving your Hit model, also update a counter column on the Page model that stores the total number of views. That adds 2 extra queries, so each hit may take roughly twice as long to process, but it lets you decide whether to send the email with a simple if.
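
A minimal sketch of that approach, using Rails' built-in counter cache (it assumes Hit belongs_to :page, a hits_count column on pages, and a hypothetical ThresholdMailer):

class Hit < ActiveRecord::Base
  # :counter_cache keeps pages.hits_count up to date automatically
  belongs_to :page, :counter_cache => true

  THRESHOLD = 1000  # X, whatever the threshold is

  after_create :notify_if_threshold_reached

  private

  def notify_if_threshold_reached
    # reload to pick up the freshly incremented counter
    ThresholdMailer.deliver_notice(page) if page.reload.hits_count == THRESHOLD
  end
end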

Your original solution is not bad.


class ApplicationController < ActionController::Base
  before_filter :increment_fancy_counter

  private

  def increment_fancy_counter
    # somehow increment the counter here
  end
end

# lib/tasks/fancy_counter.rake
namespace :fancy_counter do
  # :environment loads the Rails app so models are available in the task
  task :process => :environment do
    # somehow process the counter here
  end
end

      

Then have a cron job run rake fancy_counter:process as often as you want it to run.
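
The increment placeholder could be filled in with Rails' increment_counter, which issues a single UPDATE without instantiating a model (assuming a Page model with a hits_count column, and that params[:id] identifies the page; both are assumptions):

def increment_fancy_counter
  Page.increment_counter(:hits_count, params[:id])
end

The processing task could then look much like the polling sketch in the question above.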
