Make user special access case insensitive#6764
Conversation
cjcolvar
left a comment
I was thinking about this, and I'm wondering if it might make more sense to add before_save callbacks on the models instead of doing the downcasing a layer up in the controller/jobs. Either way, I think we'll probably need a migration to downcase existing read_users, which could be expensive. We might be able to do a Solr query for uppercase characters to avoid migrating every collection and media object in the repository. Alternatively, we could leave how they're stored unchanged and find a way to downcase before querying, but that seems like it would be tricky and involve overrides. Thoughts?
Related issue: #6742
Co-authored-by: Chris Colvard <chris.colvard@gmail.com>

The `read_users` field on the media object is not an ActiveFedora property, so `read_users.changed?` does not exist; I think the before_save callback therefore has to run every time instead of being gated with a proc to run only when the field has been updated.
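A minimal plain-Ruby sketch of that idea (the class below is an illustrative stand-in, not Avalon's actual model): the normalization runs on every save because `read_users` has no dirty tracking.

```ruby
# Illustrative stand-in for the real ActiveFedora model; only the
# save-time normalization is sketched here.
class MediaObjectStub
  attr_accessor :read_users

  def initialize(read_users: [])
    @read_users = read_users
  end

  def save
    # read_users has no `read_users.changed?` dirty tracking, so the
    # callback must run on every save rather than only when it changed.
    downcase_read_users
    true
  end

  private

  def downcase_read_users
    @read_users = @read_users.map(&:downcase)
  end
end
```

In the real model this would be a `before_save :downcase_read_users` callback rather than an explicit call inside `save`.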
Force-pushed from bb787d2 to f245f44
cjcolvar
left a comment
This is looking great!
Unfortunately, I do have a couple of thoughts which could expand the scope of this PR, so see if you think it would be best to handle them here, in another PR, or at another time (which could be never 😆):
If usernames in special access are getting downcased, then shouldn't we handle usernames as downcased across the application? If so, then unit and collection roles as well as Admin::Group members should all be downcased. (There might be other places where usernames are used, in playlists/timelines?) And if all of the string references to users are being downcased, then we should probably downcase the username/email in the User model too. If those changes are made in before_save callbacks on the models, then we probably don't need the downcases scattered throughout the application. We would have to migrate more objects, but the ActiveRecord user objects should be fast.
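Normalizing the User model the same way could look roughly like this (a hypothetical stand-in, not the real ActiveRecord model; in ActiveRecord proper this would be a `before_save` callback, where dirty tracking such as `username_changed?` does exist):

```ruby
# Hypothetical stand-in for the ActiveRecord User model, sketching the
# same save-time downcasing suggested for special access entries.
class UserStub
  attr_accessor :username, :email

  def initialize(username:, email:)
    @username = username
    @email = email
  end

  def save
    # In a real model: before_save callback, optionally gated with
    # `if: :username_changed?` since ActiveRecord tracks dirty fields.
    @username = @username.downcase
    @email = @email.downcase
    true
  end
end
```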
```ruby
end

desc "Downcase existing special access user entries"
task special_access_users: :environment do
  ids = ActiveFedora::SolrService.query('inheritable_read_access_person_ssim:/.*[A-Z].*/ OR read_access_person_ssim:/.*[A-Z].*/', fl: 'id', rows: 100_000)
```
I don't think 100,000 rows is necessarily always going to be large enough. Maybe this needs to be converted into a loop that paginates while `start + rows < numFound`? It could then use a smaller batch size (1000 is the default in ActiveFedora).
I like how this can be run as many times as needed until everything has been migrated. Another idea is adding a log line at the end with the number of rows found and the number migrated, to give some feedback that things worked and fully completed. In that vein, maybe the `save!` could be changed to `save` with the error logged if the save fails? Then the migration could keep running and do as much as it can instead of getting hung up on a broken item.
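The pagination-plus-feedback suggestion could be sketched roughly like this (not the actual rake task; `fetch` and `migrate_in_batches` are hypothetical names, with `fetch` standing in for the `ActiveFedora::SolrService.query` call and returning `[docs, num_found]`):

```ruby
# Sketch of a paginated migration loop. `fetch` is any callable mimicking
# the Solr query: it takes (start, rows) and returns [docs, num_found].
# The 1000 batch size mirrors ActiveFedora's default.
def migrate_in_batches(fetch, batch: 1000)
  start = 0
  migrated = 0
  total = 0
  loop do
    docs, total = fetch.call(start, batch)
    docs.each do |doc|
      # In the real task, use save (not save!) and log failures so one
      # broken item does not halt the whole migration.
      yield doc['id']
      migrated += 1
    end
    start += batch
    break if start >= total
  end
  [total, migrated] # e.g. for a "found X, migrated Y" log line at the end
end
```

Because the task only matches objects that still contain uppercase characters, re-running it after a partial failure picks up whatever remains.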
Co-authored-by: Chris Colvard <chris.colvard@gmail.com>
I think there is possibly too much to that discussion for the comment section here. Let's try to meet up synchronously sometime this week and hash all that out.