mirror of
https://github.com/Freika/dawarich.git
synced 2025-12-16 18:26:09 -06:00
0.36.3 (#2013)
* fix: move foreman to global gems to fix startup crash (#1971) * Update exporting code to stream points data to file in batches to red… (#1980) * Update exporting code to stream points data to file in batches to reduce memory usage * Update changelog * Update changelog * Feature/maplibre frontend (#1953) * Add a plan to use MapLibre GL JS for the frontend map rendering, replacing Leaflet * Implement phase 1 * Phases 1-3 + part of 4 * Fix e2e tests * Phase 6 * Implement fog of war * Phase 7 * Next step: fix specs, phase 7 done * Use our own map tiles * Extract v2 map logic to separate manager classes * Update settings panel on v2 map * Update v2 e2e tests structure * Reimplement location search in maps v2 * Update speed routes * Implement visits and places creation in v2 * Fix last failing test * Implement visits merging * Fix a routes e2e test and simplify the routes layer styling. * Extract js to modules from maps_v2_controller.js * Implement area creation * Fix spec problem * Fix some e2e tests * Implement live mode in v2 map * Update icons and panel * Extract some styles * Remove unused file * Start adding dark theme to popups on MapLibre maps * Make popups respect dark theme * Move v2 maps to maplibre namespace * Update v2 references to maplibre * Put place, area and visit info into side panel * Update API to use safe settings config method * Fix specs * Fix method name to config in SafeSettings and update usages accordingly * Add missing public files * Add handling for real time points * Fix remembering enabled/disabled layers of the v2 map * Fix lots of e2e tests * Add settings to select map version * Use maps/v2 as main path for MapLibre maps * Update routing * Update live mode * Update maplibre controller * Update changelog * Remove some console.log statements * Pull only necessary data for map v2 points * Feature/raw data archive (#2009) * 0.36.2 (#2007) * fix: move foreman to global gems to fix startup crash (#1971) * Update exporting code to stream points data to file in batches to red… (#1980) * Update exporting code to stream points data to file in batches to reduce memory usage * Update changelog * Update changelog * Feature/maplibre frontend (#1953) * Add a plan to use MapLibre GL JS for the frontend map rendering, replacing Leaflet * Implement phase 1 * Phases 1-3 + part of 4 * Fix e2e tests * Phase 6 * Implement fog of war * Phase 7 * Next step: fix specs, phase 7 done * Use our own map tiles * Extract v2 map logic to separate manager classes * Update settings panel on v2 map * Update v2 e2e tests structure * Reimplement location search in maps v2 * Update speed routes * Implement visits and places creation in v2 * Fix last failing test * Implement visits merging * Fix a routes e2e test and simplify the routes layer styling. 
* Extract js to modules from maps_v2_controller.js * Implement area creation * Fix spec problem * Fix some e2e tests * Implement live mode in v2 map * Update icons and panel * Extract some styles * Remove unused file * Start adding dark theme to popups on MapLibre maps * Make popups respect dark theme * Move v2 maps to maplibre namespace * Update v2 references to maplibre * Put place, area and visit info into side panel * Update API to use safe settings config method * Fix specs * Fix method name to config in SafeSettings and update usages accordingly * Add missing public files * Add handling for real time points * Fix remembering enabled/disabled layers of the v2 map * Fix lots of e2e tests * Add settings to select map version * Use maps/v2 as main path for MapLibre maps * Update routing * Update live mode * Update maplibre controller * Update changelog * Remove some console.log statements --------- Co-authored-by: Robin Tuszik <mail@robin.gg> * Remove esbuild scripts from package.json * Remove sideEffects field from package.json * Raw data archivation * Add tests * Fix tests * Fix tests * Update ExceptionReporter * Add schedule to run raw data archival job monthly * Change file structure for raw data archival feature * Update changelog and version for raw data archival feature --------- Co-authored-by: Robin Tuszik <mail@robin.gg> * Set raw_data to an empty hash instead of nil when archiving * Fix storage configuration and file extraction * Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation (#2018) * Consider MIN_MINUTES_SPENT_IN_CITY during stats calculation * Remove raw data from visited cities api endpoint * Use user timezone to show dates on maps (#2020) * Fix/pre epoch time (#2019) * Use user timezone to show dates on maps * Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates. * Limit timestamps to valid range to prevent database errors when users enter pre-epoch dates. * Fix tests failing due to new index on stats table * Fix failing specs * Update redis client configuration to support unix socket connection * Update changelog * Fix kml kmz import issues (#2023) * Fix kml kmz import issues * Refactor KML importer to improve readability and maintainability * Implement moving points in map v2 and fix route rendering logic to ma… (#2027) * Implement moving points in map v2 and fix route rendering logic to match map v1. * Fix route spec * fix(maplibre): update date format to ISO 8601 (#2029) * Add verification step to raw data archival process (#2028) * Add verification step to raw data archival process * Add actual verification of raw data archives after creation, and only clear raw_data for verified archives. * Fix failing specs * Eliminate zip-bomb risk * Fix potential memory leak in js * Return .keep files * Use Toast instead of alert for notifications * Add help section to navbar dropdown * Update changelog * Remove raw_data_archival_job * Ensure file is being closed properly after reading in Archivable concern --------- Co-authored-by: Robin Tuszik <mail@robin.gg>
@@ -1 +1 @@
0.36.2
0.36.3

18  CHANGELOG.md
@@ -4,6 +4,24 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).

# [0.36.3] - 2025-12-14

## Added

- Setting the `ARCHIVE_RAW_DATA` env var to `true` enables monthly raw data archiving for all users. The archiver looks for points older than 2 months whose `raw_data` column is not empty and creates a zip archive containing the raw data files for each month. After successful archiving, the raw data is removed from the database to save space. The archiving job runs every day at 2:00 AM. The env var defaults to `false`.
- In map v2, users can now move points when the Points layer is enabled. #2024
- In map v2, routes are now rendered with the same route-length logic as in map v1. #2026

## Fixed

- Cities visited during a trip are now calculated correctly. #547 #641 #1686 #1976
- Points on the map now show time in the user's timezone. #580 #1035 #1682
- Date range inputs now handle pre-epoch dates gracefully by clamping to the valid PostgreSQL integer range. #685
- The Redis client can now be configured to connect via a Unix socket. #1970
- Importing KML files now creates points with correct timestamps. #1988
- Importing KMZ files now works correctly.
- Map settings are now respected in map v2. #2012

# [0.36.2] - 2025-12-06
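The `ARCHIVE_RAW_DATA` toggle above is enforced by the new `Points::RawData::ArchiveJob` that appears further down in this diff; the job is a no-op unless the variable is exactly the string `'true'`. A minimal sketch of the gate and a manual trigger from a Rails console (the manual call is illustrative, not part of this commit):

```ruby
# Mirrors the guard in Points::RawData::ArchiveJob (shown later in this diff).
if ENV['ARCHIVE_RAW_DATA'] == 'true'
  # Normally enqueued by the daily 2:00 AM schedule; can also be run by hand:
  Points::RawData::ArchiveJob.perform_later
end
```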
3  Gemfile
@@ -14,6 +14,7 @@ gem 'bootsnap', require: false
|
||||
gem 'chartkick'
|
||||
gem 'data_migrate'
|
||||
gem 'devise'
|
||||
gem 'foreman'
|
||||
gem 'geocoder', github: 'Freika/geocoder', branch: 'master'
|
||||
gem 'gpx'
|
||||
gem 'groupdate'
|
||||
@@ -55,7 +56,7 @@ gem 'stimulus-rails'
|
||||
gem 'tailwindcss-rails', '= 3.3.2'
|
||||
gem 'turbo-rails', '>= 2.0.17'
|
||||
gem 'tzinfo-data', platforms: %i[mingw mswin x64_mingw jruby]
|
||||
gem 'foreman'
|
||||
gem 'with_advisory_lock'
|
||||
|
||||
group :development, :test, :staging do
|
||||
gem 'brakeman', require: false
|
||||
|
||||
@@ -620,6 +620,9 @@ GEM
|
||||
base64
|
||||
websocket-extensions (>= 0.1.0)
|
||||
websocket-extensions (0.1.5)
|
||||
with_advisory_lock (7.0.2)
|
||||
activerecord (>= 7.2)
|
||||
zeitwerk (>= 2.7)
|
||||
xpath (3.2.0)
|
||||
nokogiri (~> 1.8)
|
||||
zeitwerk (2.7.3)
|
||||
@@ -703,6 +706,7 @@ DEPENDENCIES
|
||||
turbo-rails (>= 2.0.17)
|
||||
tzinfo-data
|
||||
webmock
|
||||
with_advisory_lock
|
||||
|
||||
RUBY VERSION
|
||||
ruby 3.4.6p54
|
||||
|
||||
@@ -2,8 +2,6 @@
|
||||
|
||||
[](https://discord.gg/pHsBjpt5J8) | [](https://ko-fi.com/H2H3IDYDD) | [](https://www.patreon.com/freika)
|
||||
|
||||
[](https://app.circleci.com/pipelines/github/Freika/dawarich)
|
||||
|
||||
---
|
||||
|
||||
## 📸 Screenshots
|
||||
|
||||
File diff suppressed because one or more lines are too long
1  app/assets/svg/icons/lucide/outline/arrow-big-down.svg  Normal file
@@ -0,0 +1 @@
|
||||
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-arrow-big-down-icon lucide-arrow-big-down"><path d="M15 11a1 1 0 0 0 1 1h2.939a1 1 0 0 1 .75 1.811l-6.835 6.836a1.207 1.207 0 0 1-1.707 0L4.31 13.81a1 1 0 0 1 .75-1.811H8a1 1 0 0 0 1-1V5a1 1 0 0 1 1-1h4a1 1 0 0 1 1 1z"/></svg>
|
||||
|
@@ -0,0 +1 @@
|
||||
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-message-circle-question-mark-icon lucide-message-circle-question-mark"><path d="M2.992 16.342a2 2 0 0 1 .094 1.167l-1.065 3.29a1 1 0 0 0 1.236 1.168l3.413-.998a2 2 0 0 1 1.099.092 10 10 0 1 0-4.777-4.719"/><path d="M9.09 9a3 3 0 0 1 5.83 1c0 2-3 3-3 3"/><path d="M12 17h.01"/></svg>
|
||||
|
@@ -1,14 +1,17 @@
|
||||
# frozen_string_literal: true
|
||||
|
||||
class Api::V1::Countries::VisitedCitiesController < ApiController
|
||||
include SafeTimestampParser
|
||||
|
||||
before_action :validate_params
|
||||
|
||||
def index
|
||||
start_at = DateTime.parse(params[:start_at]).to_i
|
||||
end_at = DateTime.parse(params[:end_at]).to_i
|
||||
start_at = safe_timestamp(params[:start_at])
|
||||
end_at = safe_timestamp(params[:end_at])
|
||||
|
||||
points = current_api_user
|
||||
.points
|
||||
.without_raw_data
|
||||
.where(timestamp: start_at..end_at)
|
||||
|
||||
render json: { data: CountriesAndCities.new(points).call }
|
||||
|
||||
@@ -1,12 +1,14 @@
|
||||
# frozen_string_literal: true
|
||||
|
||||
class Api::V1::PointsController < ApiController
|
||||
include SafeTimestampParser
|
||||
|
||||
before_action :authenticate_active_api_user!, only: %i[create update destroy bulk_destroy]
|
||||
before_action :validate_points_limit, only: %i[create]
|
||||
|
||||
def index
|
||||
start_at = params[:start_at]&.to_datetime&.to_i
|
||||
end_at = params[:end_at]&.to_datetime&.to_i || Time.zone.now.to_i
|
||||
start_at = params[:start_at].present? ? safe_timestamp(params[:start_at]) : nil
|
||||
end_at = params[:end_at].present? ? safe_timestamp(params[:end_at]) : Time.zone.now.to_i
|
||||
order = params[:order] || 'desc'
|
||||
|
||||
points = current_api_user
|
||||
|
||||
24  app/controllers/concerns/safe_timestamp_parser.rb  Normal file
@@ -0,0 +1,24 @@
|
||||
# frozen_string_literal: true
|
||||
|
||||
module SafeTimestampParser
|
||||
extend ActiveSupport::Concern
|
||||
|
||||
private
|
||||
|
||||
def safe_timestamp(date_string)
|
||||
return Time.zone.now.to_i if date_string.blank?
|
||||
|
||||
parsed_time = Time.zone.parse(date_string)
|
||||
|
||||
# Time.zone.parse returns epoch time (2000-01-01) for unparseable strings
|
||||
# Check if it's a valid parse by seeing if year is suspiciously at epoch
|
||||
return Time.zone.now.to_i if parsed_time.nil? || (parsed_time.year == 2000 && !date_string.include?('2000'))
|
||||
|
||||
min_timestamp = Time.zone.parse('1970-01-01').to_i
|
||||
max_timestamp = Time.zone.parse('2100-01-01').to_i
|
||||
|
||||
parsed_time.to_i.clamp(min_timestamp, max_timestamp)
|
||||
rescue ArgumentError, TypeError
|
||||
Time.zone.now.to_i
|
||||
end
|
||||
end
|
||||
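The controllers changed in this commit mix the concern in and replace their direct `Time.zone.parse` calls with `safe_timestamp`. A minimal usage sketch (the controller name and parameter values are hypothetical):

```ruby
class ExampleController < ApplicationController
  include SafeTimestampParser

  def index
    # Parsed values are clamped to the 1970-2100 range; unparseable or blank
    # input falls back to the current time.
    start_at = safe_timestamp(params[:start_at]) # "1969-07-20" => clamped to 1970-01-01
    end_at   = safe_timestamp(params[:end_at])   # "not a date" => Time.zone.now.to_i
    # ... query points between start_at and end_at ...
  end
end
```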
@@ -1,6 +1,8 @@
|
||||
# frozen_string_literal: true
|
||||
|
||||
class Map::LeafletController < ApplicationController
|
||||
include SafeTimestampParser
|
||||
|
||||
before_action :authenticate_user!
|
||||
layout 'map', only: :index
|
||||
|
||||
@@ -71,14 +73,14 @@ class Map::LeafletController < ApplicationController
|
||||
end
|
||||
|
||||
def start_at
|
||||
return Time.zone.parse(params[:start_at]).to_i if params[:start_at].present?
|
||||
return safe_timestamp(params[:start_at]) if params[:start_at].present?
|
||||
return Time.zone.at(points.last.timestamp).beginning_of_day.to_i if points.any?
|
||||
|
||||
Time.zone.today.beginning_of_day.to_i
|
||||
end
|
||||
|
||||
def end_at
|
||||
return Time.zone.parse(params[:end_at]).to_i if params[:end_at].present?
|
||||
return safe_timestamp(params[:end_at]) if params[:end_at].present?
|
||||
return Time.zone.at(points.last.timestamp).end_of_day.to_i if points.any?
|
||||
|
||||
Time.zone.today.end_of_day.to_i
|
||||
|
||||
@@ -1,5 +1,7 @@
|
||||
module Map
|
||||
class MaplibreController < ApplicationController
|
||||
include SafeTimestampParser
|
||||
|
||||
before_action :authenticate_user!
|
||||
layout 'map'
|
||||
|
||||
@@ -11,13 +13,13 @@ module Map
|
||||
private
|
||||
|
||||
def start_at
|
||||
return Time.zone.parse(params[:start_at]).to_i if params[:start_at].present?
|
||||
return safe_timestamp(params[:start_at]) if params[:start_at].present?
|
||||
|
||||
Time.zone.today.beginning_of_day.to_i
|
||||
end
|
||||
|
||||
def end_at
|
||||
return Time.zone.parse(params[:end_at]).to_i if params[:end_at].present?
|
||||
return safe_timestamp(params[:end_at]) if params[:end_at].present?
|
||||
|
||||
Time.zone.today.end_of_day.to_i
|
||||
end
|
||||
|
||||
@@ -1,6 +1,8 @@
|
||||
# frozen_string_literal: true
|
||||
|
||||
class PointsController < ApplicationController
|
||||
include SafeTimestampParser
|
||||
|
||||
before_action :authenticate_user!
|
||||
|
||||
def index
|
||||
@@ -40,13 +42,13 @@ class PointsController < ApplicationController
|
||||
def start_at
|
||||
return 1.month.ago.beginning_of_day.to_i if params[:start_at].nil?
|
||||
|
||||
Time.zone.parse(params[:start_at]).to_i
|
||||
safe_timestamp(params[:start_at])
|
||||
end
|
||||
|
||||
def end_at
|
||||
return Time.zone.today.end_of_day.to_i if params[:end_at].nil?
|
||||
|
||||
Time.zone.parse(params[:end_at]).to_i
|
||||
safe_timestamp(params[:end_at])
|
||||
end
|
||||
|
||||
def points
|
||||
|
||||
@@ -7,7 +7,8 @@ export default class extends Controller {
|
||||
|
||||
static values = {
|
||||
features: Object,
|
||||
userTheme: String
|
||||
userTheme: String,
|
||||
timezone: String
|
||||
}
|
||||
|
||||
connect() {
|
||||
@@ -106,7 +107,8 @@ export default class extends Controller {
|
||||
});
|
||||
|
||||
// Format timestamp for display
|
||||
const lastSeen = new Date(location.updated_at).toLocaleString();
|
||||
const timezone = this.timezoneValue || 'UTC';
|
||||
const lastSeen = new Date(location.updated_at).toLocaleString('en-US', { timeZone: timezone });
|
||||
|
||||
// Create small tooltip that shows automatically
|
||||
const tooltipContent = this.createTooltipContent(lastSeen, location.battery);
|
||||
@@ -176,7 +178,8 @@ export default class extends Controller {
|
||||
existingMarker.setIcon(newIcon);
|
||||
|
||||
// Update tooltip content
|
||||
const lastSeen = new Date(locationData.updated_at).toLocaleString();
|
||||
const timezone = this.timezoneValue || 'UTC';
|
||||
const lastSeen = new Date(locationData.updated_at).toLocaleString('en-US', { timeZone: timezone });
|
||||
const tooltipContent = this.createTooltipContent(lastSeen, locationData.battery);
|
||||
existingMarker.setTooltipContent(tooltipContent);
|
||||
|
||||
@@ -214,7 +217,8 @@ export default class extends Controller {
|
||||
})
|
||||
});
|
||||
|
||||
const lastSeen = new Date(location.updated_at).toLocaleString();
|
||||
const timezone = this.timezoneValue || 'UTC';
|
||||
const lastSeen = new Date(location.updated_at).toLocaleString('en-US', { timeZone: timezone });
|
||||
|
||||
const tooltipContent = this.createTooltipContent(lastSeen, location.battery);
|
||||
familyMarker.bindTooltip(tooltipContent, {
|
||||
|
||||
@@ -7,9 +7,17 @@ import { performanceMonitor } from 'maps_maplibre/utils/performance_monitor'
|
||||
* Handles loading and transforming data from API
|
||||
*/
|
||||
export class DataLoader {
|
||||
constructor(api, apiKey) {
|
||||
constructor(api, apiKey, settings = {}) {
|
||||
this.api = api
|
||||
this.apiKey = apiKey
|
||||
this.settings = settings
|
||||
}
|
||||
|
||||
/**
|
||||
* Update settings (called when user changes settings)
|
||||
*/
|
||||
updateSettings(settings) {
|
||||
this.settings = settings
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -30,7 +38,10 @@ export class DataLoader {
|
||||
// Transform points to GeoJSON
|
||||
performanceMonitor.mark('transform-geojson')
|
||||
data.pointsGeoJSON = pointsToGeoJSON(data.points)
|
||||
data.routesGeoJSON = RoutesLayer.pointsToRoutes(data.points)
|
||||
data.routesGeoJSON = RoutesLayer.pointsToRoutes(data.points, {
|
||||
distanceThresholdMeters: this.settings.metersBetweenRoutes || 1000,
|
||||
timeThresholdMinutes: this.settings.minutesBetweenRoutes || 60
|
||||
})
|
||||
performanceMonitor.measure('transform-geojson')
|
||||
|
||||
// Fetch visits
|
||||
|
||||
@@ -18,7 +18,7 @@ export class EventHandlers {
|
||||
|
||||
const content = `
|
||||
<div class="space-y-2">
|
||||
<div><span class="font-semibold">Time:</span> ${formatTimestamp(properties.timestamp)}</div>
|
||||
<div><span class="font-semibold">Time:</span> ${formatTimestamp(properties.timestamp, this.controller.timezoneValue)}</div>
|
||||
${properties.battery ? `<div><span class="font-semibold">Battery:</span> ${properties.battery}%</div>` : ''}
|
||||
${properties.altitude ? `<div><span class="font-semibold">Altitude:</span> ${Math.round(properties.altitude)}m</div>` : ''}
|
||||
${properties.velocity ? `<div><span class="font-semibold">Speed:</span> ${Math.round(properties.velocity)} km/h</div>` : ''}
|
||||
@@ -35,8 +35,8 @@ export class EventHandlers {
|
||||
const feature = e.features[0]
|
||||
const properties = feature.properties
|
||||
|
||||
const startTime = formatTimestamp(properties.started_at)
|
||||
const endTime = formatTimestamp(properties.ended_at)
|
||||
const startTime = formatTimestamp(properties.started_at, this.controller.timezoneValue)
|
||||
const endTime = formatTimestamp(properties.ended_at, this.controller.timezoneValue)
|
||||
const durationHours = Math.round(properties.duration / 3600)
|
||||
const durationDisplay = durationHours >= 1 ? `${durationHours}h` : `${Math.round(properties.duration / 60)}m`
|
||||
|
||||
@@ -70,7 +70,7 @@ export class EventHandlers {
|
||||
const content = `
|
||||
<div class="space-y-2">
|
||||
${properties.photo_url ? `<img src="${properties.photo_url}" alt="Photo" class="w-full rounded-lg mb-2" />` : ''}
|
||||
${properties.taken_at ? `<div><span class="font-semibold">Taken:</span> ${formatTimestamp(properties.taken_at)}</div>` : ''}
|
||||
${properties.taken_at ? `<div><span class="font-semibold">Taken:</span> ${formatTimestamp(properties.taken_at, this.controller.timezoneValue)}</div>` : ''}
|
||||
</div>
|
||||
`
|
||||
|
||||
|
||||
@@ -247,7 +247,9 @@ export class LayerManager {
|
||||
_addPointsLayer(pointsGeoJSON) {
|
||||
if (!this.layers.pointsLayer) {
|
||||
this.layers.pointsLayer = new PointsLayer(this.map, {
|
||||
visible: this.settings.pointsVisible !== false // Default true unless explicitly false
|
||||
visible: this.settings.pointsVisible !== false, // Default true unless explicitly false
|
||||
apiClient: this.api,
|
||||
layerManager: this
|
||||
})
|
||||
this.layers.pointsLayer.add(pointsGeoJSON)
|
||||
} else {
|
||||
|
||||
@@ -173,7 +173,7 @@ export class RoutesManager {
|
||||
timestamp: f.properties.timestamp
|
||||
})) || []
|
||||
|
||||
const distanceThresholdMeters = this.settings.metersBetweenRoutes || 500
|
||||
const distanceThresholdMeters = this.settings.metersBetweenRoutes || 1000
|
||||
const timeThresholdMinutes = this.settings.minutesBetweenRoutes || 60
|
||||
|
||||
const { calculateSpeed, getSpeedColor } = await import('maps_maplibre/utils/speed_colors')
|
||||
|
||||
@@ -22,12 +22,17 @@ export class SettingsController {
|
||||
}
|
||||
|
||||
/**
|
||||
* Load settings (sync from backend and localStorage)
|
||||
* Load settings (sync from backend)
|
||||
*/
|
||||
async loadSettings() {
|
||||
this.settings = await SettingsManager.sync()
|
||||
this.controller.settings = this.settings
|
||||
console.log('[Maps V2] Settings loaded:', this.settings)
|
||||
|
||||
// Update dataLoader with new settings
|
||||
if (this.controller.dataLoader) {
|
||||
this.controller.dataLoader.updateSettings(this.settings)
|
||||
}
|
||||
|
||||
return this.settings
|
||||
}
|
||||
|
||||
@@ -134,8 +139,6 @@ export class SettingsController {
|
||||
if (speedColoredRoutesToggle) {
|
||||
speedColoredRoutesToggle.checked = this.settings.speedColoredRoutes || false
|
||||
}
|
||||
|
||||
console.log('[Maps V2] UI controls synced with settings')
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -154,7 +157,6 @@ export class SettingsController {
|
||||
|
||||
// Reload layers after style change
|
||||
this.map.once('style.load', () => {
|
||||
console.log('Style loaded, reloading map data')
|
||||
this.controller.loadMapData()
|
||||
})
|
||||
}
|
||||
@@ -203,11 +205,17 @@ export class SettingsController {
|
||||
// Apply settings to current map
|
||||
await this.applySettingsToMap(settings)
|
||||
|
||||
// Save to backend and localStorage
|
||||
// Save to backend
|
||||
for (const [key, value] of Object.entries(settings)) {
|
||||
await SettingsManager.updateSetting(key, value)
|
||||
}
|
||||
|
||||
// Update controller settings and dataLoader
|
||||
this.controller.settings = { ...this.controller.settings, ...settings }
|
||||
if (this.controller.dataLoader) {
|
||||
this.controller.dataLoader.updateSettings(this.controller.settings)
|
||||
}
|
||||
|
||||
Toast.success('Settings updated successfully')
|
||||
}
|
||||
|
||||
|
||||
@@ -26,7 +26,8 @@ export default class extends Controller {
|
||||
static values = {
|
||||
apiKey: String,
|
||||
startDate: String,
|
||||
endDate: String
|
||||
endDate: String,
|
||||
timezone: String
|
||||
}
|
||||
|
||||
static targets = [
|
||||
@@ -92,7 +93,7 @@ export default class extends Controller {
|
||||
|
||||
// Initialize managers
|
||||
this.layerManager = new LayerManager(this.map, this.settings, this.api)
|
||||
this.dataLoader = new DataLoader(this.api, this.apiKeyValue)
|
||||
this.dataLoader = new DataLoader(this.api, this.apiKeyValue, this.settings)
|
||||
this.eventHandlers = new EventHandlers(this.map, this)
|
||||
this.filterManager = new FilterManager(this.dataLoader)
|
||||
this.mapDataManager = new MapDataManager(this)
|
||||
|
||||
@@ -2220,6 +2220,7 @@ export default class extends BaseController {
|
||||
return;
|
||||
}
|
||||
|
||||
const timezone = this.timezone || 'UTC';
|
||||
const html = citiesData.map(country => `
|
||||
<div class="mb-4" style="min-width: min-content;">
|
||||
<h4 class="font-bold text-md">${country.country}</h4>
|
||||
@@ -2228,7 +2229,7 @@ export default class extends BaseController {
|
||||
<li class="text-sm whitespace-nowrap">
|
||||
${city.city}
|
||||
<span class="text-gray-500">
|
||||
(${new Date(city.timestamp * 1000).toLocaleDateString()})
|
||||
(${new Date(city.timestamp * 1000).toLocaleDateString('en-US', { timeZone: timezone })})
|
||||
</span>
|
||||
</li>
|
||||
`).join('')}
|
||||
|
||||
@@ -10,7 +10,8 @@ export default class extends BaseController {
|
||||
uuid: String,
|
||||
dataBounds: Object,
|
||||
hexagonsAvailable: Boolean,
|
||||
selfHosted: String
|
||||
selfHosted: String,
|
||||
timezone: String
|
||||
};
|
||||
|
||||
connect() {
|
||||
@@ -247,10 +248,11 @@ export default class extends BaseController {
|
||||
}
|
||||
|
||||
buildPopupContent(props) {
|
||||
const startDate = props.earliest_point ? new Date(props.earliest_point).toLocaleDateString() : 'N/A';
|
||||
const endDate = props.latest_point ? new Date(props.latest_point).toLocaleDateString() : 'N/A';
|
||||
const startTime = props.earliest_point ? new Date(props.earliest_point).toLocaleTimeString() : '';
|
||||
const endTime = props.latest_point ? new Date(props.latest_point).toLocaleTimeString() : '';
|
||||
const timezone = this.timezoneValue || 'UTC';
|
||||
const startDate = props.earliest_point ? new Date(props.earliest_point).toLocaleDateString('en-US', { timeZone: timezone }) : 'N/A';
|
||||
const endDate = props.latest_point ? new Date(props.latest_point).toLocaleDateString('en-US', { timeZone: timezone }) : 'N/A';
|
||||
const startTime = props.earliest_point ? new Date(props.earliest_point).toLocaleTimeString('en-US', { timeZone: timezone }) : '';
|
||||
const endTime = props.latest_point ? new Date(props.latest_point).toLocaleTimeString('en-US', { timeZone: timezone }) : '';
|
||||
|
||||
return `
|
||||
<div style="font-size: 12px; line-height: 1.6; max-width: 300px;">
|
||||
|
||||
@@ -1,11 +1,25 @@
|
||||
import { BaseLayer } from './base_layer'
|
||||
import { Toast } from 'maps_maplibre/components/toast'
|
||||
|
||||
/**
|
||||
* Points layer for displaying individual location points
|
||||
* Supports dragging points to update their positions
|
||||
*/
|
||||
export class PointsLayer extends BaseLayer {
|
||||
constructor(map, options = {}) {
|
||||
super(map, { id: 'points', ...options })
|
||||
this.apiClient = options.apiClient
|
||||
this.layerManager = options.layerManager
|
||||
this.isDragging = false
|
||||
this.draggedFeature = null
|
||||
this.canvas = null
|
||||
|
||||
// Bind event handlers once and store references for proper cleanup
|
||||
this._onMouseEnter = this.onMouseEnter.bind(this)
|
||||
this._onMouseLeave = this.onMouseLeave.bind(this)
|
||||
this._onMouseDown = this.onMouseDown.bind(this)
|
||||
this._onMouseMove = this.onMouseMove.bind(this)
|
||||
this._onMouseUp = this.onMouseUp.bind(this)
|
||||
}
|
||||
|
||||
getSourceConfig() {
|
||||
@@ -34,4 +48,218 @@ export class PointsLayer extends BaseLayer {
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
/**
|
||||
* Enable dragging for points
|
||||
*/
|
||||
enableDragging() {
|
||||
if (this.draggingEnabled) return
|
||||
|
||||
this.draggingEnabled = true
|
||||
this.canvas = this.map.getCanvasContainer()
|
||||
|
||||
// Change cursor to pointer when hovering over points
|
||||
this.map.on('mouseenter', this.id, this._onMouseEnter)
|
||||
this.map.on('mouseleave', this.id, this._onMouseLeave)
|
||||
|
||||
// Handle drag events
|
||||
this.map.on('mousedown', this.id, this._onMouseDown)
|
||||
}
|
||||
|
||||
/**
|
||||
* Disable dragging for points
|
||||
*/
|
||||
disableDragging() {
|
||||
if (!this.draggingEnabled) return
|
||||
|
||||
this.draggingEnabled = false
|
||||
|
||||
this.map.off('mouseenter', this.id, this._onMouseEnter)
|
||||
this.map.off('mouseleave', this.id, this._onMouseLeave)
|
||||
this.map.off('mousedown', this.id, this._onMouseDown)
|
||||
}
|
||||
|
||||
onMouseEnter() {
|
||||
this.canvas.style.cursor = 'move'
|
||||
}
|
||||
|
||||
onMouseLeave() {
|
||||
if (!this.isDragging) {
|
||||
this.canvas.style.cursor = ''
|
||||
}
|
||||
}
|
||||
|
||||
onMouseDown(e) {
|
||||
// Prevent default map drag behavior
|
||||
e.preventDefault()
|
||||
|
||||
// Store the feature being dragged
|
||||
this.draggedFeature = e.features[0]
|
||||
this.isDragging = true
|
||||
this.canvas.style.cursor = 'grabbing'
|
||||
|
||||
// Bind mouse move and up events
|
||||
this.map.on('mousemove', this._onMouseMove)
|
||||
this.map.once('mouseup', this._onMouseUp)
|
||||
}
|
||||
|
||||
onMouseMove(e) {
|
||||
if (!this.isDragging || !this.draggedFeature) return
|
||||
|
||||
// Get the new coordinates
|
||||
const coords = e.lngLat
|
||||
|
||||
// Update the feature's coordinates in the source
|
||||
const source = this.map.getSource(this.sourceId)
|
||||
if (source) {
|
||||
const data = source._data
|
||||
const feature = data.features.find(f => f.properties.id === this.draggedFeature.properties.id)
|
||||
if (feature) {
|
||||
feature.geometry.coordinates = [coords.lng, coords.lat]
|
||||
source.setData(data)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
async onMouseUp(e) {
|
||||
if (!this.isDragging || !this.draggedFeature) return
|
||||
|
||||
const coords = e.lngLat
|
||||
const pointId = this.draggedFeature.properties.id
|
||||
const originalCoords = this.draggedFeature.geometry.coordinates
|
||||
|
||||
// Clean up drag state
|
||||
this.isDragging = false
|
||||
this.canvas.style.cursor = ''
|
||||
this.map.off('mousemove', this._onMouseMove)
|
||||
|
||||
// Update the point on the backend
|
||||
try {
|
||||
await this.updatePointPosition(pointId, coords.lat, coords.lng)
|
||||
|
||||
// Update routes after successful point update
|
||||
await this.updateConnectedRoutes(pointId, originalCoords, [coords.lng, coords.lat])
|
||||
} catch (error) {
|
||||
console.error('Failed to update point:', error)
|
||||
// Revert the point position on error
|
||||
const source = this.map.getSource(this.sourceId)
|
||||
if (source) {
|
||||
const data = source._data
|
||||
const feature = data.features.find(f => f.properties.id === pointId)
|
||||
if (feature && originalCoords) {
|
||||
feature.geometry.coordinates = originalCoords
|
||||
source.setData(data)
|
||||
}
|
||||
}
|
||||
Toast.error('Failed to update point position. Please try again.')
|
||||
}
|
||||
|
||||
this.draggedFeature = null
|
||||
}
|
||||
|
||||
/**
|
||||
* Update point position via API
|
||||
*/
|
||||
async updatePointPosition(pointId, latitude, longitude) {
|
||||
if (!this.apiClient) {
|
||||
throw new Error('API client not configured')
|
||||
}
|
||||
|
||||
const response = await fetch(`/api/v1/points/${pointId}`, {
|
||||
method: 'PATCH',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
'Accept': 'application/json',
|
||||
'Authorization': `Bearer ${this.apiClient.apiKey}`
|
||||
},
|
||||
body: JSON.stringify({
|
||||
point: {
|
||||
latitude: latitude.toString(),
|
||||
longitude: longitude.toString()
|
||||
}
|
||||
})
|
||||
})
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error(`HTTP error! status: ${response.status}`)
|
||||
}
|
||||
|
||||
return response.json()
|
||||
}
|
||||
|
||||
/**
|
||||
* Update connected route segments when a point is moved
|
||||
*/
|
||||
async updateConnectedRoutes(pointId, oldCoords, newCoords) {
|
||||
if (!this.layerManager) {
|
||||
console.warn('LayerManager not configured, cannot update routes')
|
||||
return
|
||||
}
|
||||
|
||||
const routesLayer = this.layerManager.getLayer('routes')
|
||||
if (!routesLayer) {
|
||||
console.warn('Routes layer not found')
|
||||
return
|
||||
}
|
||||
|
||||
const routesSource = this.map.getSource(routesLayer.sourceId)
|
||||
if (!routesSource) {
|
||||
console.warn('Routes source not found')
|
||||
return
|
||||
}
|
||||
|
||||
const routesData = routesSource._data
|
||||
if (!routesData || !routesData.features) {
|
||||
return
|
||||
}
|
||||
|
||||
// Tolerance for coordinate comparison (account for floating point precision)
|
||||
const tolerance = 0.0001
|
||||
let routesUpdated = false
|
||||
|
||||
// Find and update route segments that contain the moved point
|
||||
routesData.features.forEach(feature => {
|
||||
if (feature.geometry.type === 'LineString') {
|
||||
const coordinates = feature.geometry.coordinates
|
||||
|
||||
// Check each coordinate in the line
|
||||
for (let i = 0; i < coordinates.length; i++) {
|
||||
const coord = coordinates[i]
|
||||
|
||||
// Check if this coordinate matches the old position
|
||||
if (Math.abs(coord[0] - oldCoords[0]) < tolerance &&
|
||||
Math.abs(coord[1] - oldCoords[1]) < tolerance) {
|
||||
// Update to new position
|
||||
coordinates[i] = newCoords
|
||||
routesUpdated = true
|
||||
}
|
||||
}
|
||||
}
|
||||
})
|
||||
|
||||
// Update the routes source if any routes were modified
|
||||
if (routesUpdated) {
|
||||
routesSource.setData(routesData)
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Override add method to enable dragging when layer is added
|
||||
*/
|
||||
add(data) {
|
||||
super.add(data)
|
||||
|
||||
// Wait for next tick to ensure layers are fully added before enabling dragging
|
||||
setTimeout(() => {
|
||||
this.enableDragging()
|
||||
}, 100)
|
||||
}
|
||||
|
||||
/**
|
||||
* Override remove method to clean up dragging handlers
|
||||
*/
|
||||
remove() {
|
||||
this.disableDragging()
|
||||
super.remove()
|
||||
}
|
||||
}
|
||||
|
||||
@@ -31,7 +31,13 @@ export class RoutesLayer extends BaseLayer {
|
||||
'line-cap': 'round'
|
||||
},
|
||||
paint: {
|
||||
'line-color': '#f97316', // Solid orange color
|
||||
// Use color from feature properties if available, otherwise default blue
|
||||
'line-color': [
|
||||
'case',
|
||||
['has', 'color'],
|
||||
['get', 'color'],
|
||||
'#0000ff' // Default blue color (matching v1)
|
||||
],
|
||||
'line-width': 3,
|
||||
'line-opacity': 0.8
|
||||
}
|
||||
|
||||
@@ -18,7 +18,8 @@ export class ApiClient {
|
||||
start_at,
|
||||
end_at,
|
||||
page: page.toString(),
|
||||
per_page: per_page.toString()
|
||||
per_page: per_page.toString(),
|
||||
slim: 'true'
|
||||
})
|
||||
|
||||
const response = await fetch(`${this.baseURL}/points?${params}`, {
|
||||
|
||||
@@ -28,9 +28,10 @@ export function pointsToGeoJSON(points) {
|
||||
/**
|
||||
* Format timestamp for display
|
||||
* @param {number|string} timestamp - Unix timestamp (seconds) or ISO 8601 string
|
||||
* @param {string} timezone - IANA timezone string (e.g., 'Europe/Berlin')
|
||||
* @returns {string} Formatted date/time
|
||||
*/
|
||||
export function formatTimestamp(timestamp) {
|
||||
export function formatTimestamp(timestamp, timezone = 'UTC') {
|
||||
// Handle different timestamp formats
|
||||
let date
|
||||
if (typeof timestamp === 'string') {
|
||||
@@ -49,6 +50,7 @@ export function formatTimestamp(timestamp) {
|
||||
month: 'short',
|
||||
day: 'numeric',
|
||||
hour: '2-digit',
|
||||
minute: '2-digit'
|
||||
minute: '2-digit',
|
||||
timeZone: timezone
|
||||
})
|
||||
}
|
||||
|
||||
@@ -1,21 +1,20 @@
|
||||
/**
|
||||
* Settings manager for persisting user preferences
|
||||
* Supports both localStorage (fallback) and backend API (primary)
|
||||
* Loads settings from backend API only (no localStorage)
|
||||
*/
|
||||
|
||||
const STORAGE_KEY = 'dawarich-maps-maplibre-settings'
|
||||
|
||||
const DEFAULT_SETTINGS = {
|
||||
mapStyle: 'light',
|
||||
enabledMapLayers: ['Points', 'Routes'], // Compatible with v1 map
|
||||
// Advanced settings
|
||||
routeOpacity: 1.0,
|
||||
fogOfWarRadius: 1000,
|
||||
// Advanced settings (matching v1 naming)
|
||||
routeOpacity: 0.6,
|
||||
fogOfWarRadius: 100,
|
||||
fogOfWarThreshold: 1,
|
||||
metersBetweenRoutes: 500,
|
||||
metersBetweenRoutes: 1000,
|
||||
minutesBetweenRoutes: 60,
|
||||
pointsRenderingMode: 'raw',
|
||||
speedColoredRoutes: false
|
||||
speedColoredRoutes: false,
|
||||
speedColorScale: '0:#00ff00|15:#00ffff|30:#ff00ff|50:#ffff00|100:#ff3300'
|
||||
}
|
||||
|
||||
// Mapping between v2 layer names and v1 layer names in enabled_map_layers array
|
||||
@@ -34,7 +33,15 @@ const LAYER_NAME_MAP = {
|
||||
// Mapping between frontend settings and backend API keys
|
||||
const BACKEND_SETTINGS_MAP = {
|
||||
mapStyle: 'maps_maplibre_style',
|
||||
enabledMapLayers: 'enabled_map_layers'
|
||||
enabledMapLayers: 'enabled_map_layers',
|
||||
routeOpacity: 'route_opacity',
|
||||
fogOfWarRadius: 'fog_of_war_meters',
|
||||
fogOfWarThreshold: 'fog_of_war_threshold',
|
||||
metersBetweenRoutes: 'meters_between_routes',
|
||||
minutesBetweenRoutes: 'minutes_between_routes',
|
||||
pointsRenderingMode: 'points_rendering_mode',
|
||||
speedColoredRoutes: 'speed_colored_routes',
|
||||
speedColorScale: 'speed_color_scale'
|
||||
}
|
||||
|
||||
export class SettingsManager {
|
||||
@@ -51,9 +58,8 @@ export class SettingsManager {
|
||||
}
|
||||
|
||||
/**
|
||||
* Get all settings (localStorage first, then merge with defaults)
|
||||
* Get all settings from cache or defaults
|
||||
* Converts enabled_map_layers array to individual boolean flags
|
||||
* Uses cached settings if available to avoid race conditions
|
||||
* @returns {Object} Settings object
|
||||
*/
|
||||
static getSettings() {
|
||||
@@ -62,21 +68,11 @@ export class SettingsManager {
|
||||
return { ...this.cachedSettings }
|
||||
}
|
||||
|
||||
try {
|
||||
const stored = localStorage.getItem(STORAGE_KEY)
|
||||
const settings = stored ? { ...DEFAULT_SETTINGS, ...JSON.parse(stored) } : DEFAULT_SETTINGS
|
||||
// Convert enabled_map_layers array to individual boolean flags
|
||||
const expandedSettings = this._expandLayerSettings(DEFAULT_SETTINGS)
|
||||
this.cachedSettings = expandedSettings
|
||||
|
||||
// Convert enabled_map_layers array to individual boolean flags
|
||||
const expandedSettings = this._expandLayerSettings(settings)
|
||||
|
||||
// Cache the settings
|
||||
this.cachedSettings = expandedSettings
|
||||
|
||||
return { ...expandedSettings }
|
||||
} catch (error) {
|
||||
console.error('Failed to load settings:', error)
|
||||
return DEFAULT_SETTINGS
|
||||
}
|
||||
return { ...expandedSettings }
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -141,14 +137,31 @@ export class SettingsManager {
|
||||
const frontendSettings = {}
|
||||
Object.entries(BACKEND_SETTINGS_MAP).forEach(([frontendKey, backendKey]) => {
|
||||
if (backendKey in backendSettings) {
|
||||
frontendSettings[frontendKey] = backendSettings[backendKey]
|
||||
let value = backendSettings[backendKey]
|
||||
|
||||
// Convert backend values to correct types
|
||||
if (frontendKey === 'routeOpacity') {
|
||||
value = parseFloat(value) || DEFAULT_SETTINGS.routeOpacity
|
||||
} else if (frontendKey === 'fogOfWarRadius') {
|
||||
value = parseInt(value) || DEFAULT_SETTINGS.fogOfWarRadius
|
||||
} else if (frontendKey === 'fogOfWarThreshold') {
|
||||
value = parseInt(value) || DEFAULT_SETTINGS.fogOfWarThreshold
|
||||
} else if (frontendKey === 'metersBetweenRoutes') {
|
||||
value = parseInt(value) || DEFAULT_SETTINGS.metersBetweenRoutes
|
||||
} else if (frontendKey === 'minutesBetweenRoutes') {
|
||||
value = parseInt(value) || DEFAULT_SETTINGS.minutesBetweenRoutes
|
||||
} else if (frontendKey === 'speedColoredRoutes') {
|
||||
value = value === true || value === 'true'
|
||||
}
|
||||
|
||||
frontendSettings[frontendKey] = value
|
||||
}
|
||||
})
|
||||
|
||||
// Merge with defaults, but prioritize backend's enabled_map_layers completely
|
||||
// Merge with defaults
|
||||
const mergedSettings = { ...DEFAULT_SETTINGS, ...frontendSettings }
|
||||
|
||||
// If backend has enabled_map_layers, use it as-is (don't merge with defaults)
|
||||
// If backend has enabled_map_layers, use it as-is
|
||||
if (backendSettings.enabled_map_layers) {
|
||||
mergedSettings.enabledMapLayers = backendSettings.enabled_map_layers
|
||||
}
|
||||
@@ -156,8 +169,8 @@ export class SettingsManager {
|
||||
// Convert enabled_map_layers array to individual boolean flags
|
||||
const expandedSettings = this._expandLayerSettings(mergedSettings)
|
||||
|
||||
// Save to localStorage and cache
|
||||
this.saveToLocalStorage(expandedSettings)
|
||||
// Cache the settings
|
||||
this.cachedSettings = expandedSettings
|
||||
|
||||
return expandedSettings
|
||||
} catch (error) {
|
||||
@@ -167,18 +180,11 @@ export class SettingsManager {
|
||||
}
|
||||
|
||||
/**
|
||||
* Save all settings to localStorage and update cache
|
||||
* Update cache with new settings
|
||||
* @param {Object} settings - Settings object
|
||||
*/
|
||||
static saveToLocalStorage(settings) {
|
||||
try {
|
||||
// Update cache first
|
||||
this.cachedSettings = { ...settings }
|
||||
// Then save to localStorage
|
||||
localStorage.setItem(STORAGE_KEY, JSON.stringify(settings))
|
||||
} catch (error) {
|
||||
console.error('Failed to save settings to localStorage:', error)
|
||||
}
|
||||
static updateCache(settings) {
|
||||
this.cachedSettings = { ...settings }
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -203,7 +209,19 @@ export class SettingsManager {
|
||||
// Use the collapsed array
|
||||
backendSettings[backendKey] = enabledMapLayers
|
||||
} else if (frontendKey in settings) {
|
||||
backendSettings[backendKey] = settings[frontendKey]
|
||||
let value = settings[frontendKey]
|
||||
|
||||
// Convert frontend values to backend format
|
||||
if (frontendKey === 'routeOpacity') {
|
||||
value = parseFloat(value).toString()
|
||||
} else if (frontendKey === 'fogOfWarRadius' || frontendKey === 'fogOfWarThreshold' ||
|
||||
frontendKey === 'metersBetweenRoutes' || frontendKey === 'minutesBetweenRoutes') {
|
||||
value = parseInt(value).toString()
|
||||
} else if (frontendKey === 'speedColoredRoutes') {
|
||||
value = Boolean(value)
|
||||
}
|
||||
|
||||
backendSettings[backendKey] = value
|
||||
}
|
||||
})
|
||||
|
||||
@@ -220,7 +238,6 @@ export class SettingsManager {
|
||||
throw new Error(`Failed to save settings: ${response.status}`)
|
||||
}
|
||||
|
||||
console.log('[Settings] Saved to backend successfully:', backendSettings)
|
||||
return true
|
||||
} catch (error) {
|
||||
console.error('[Settings] Failed to save to backend:', error)
|
||||
@@ -238,7 +255,7 @@ export class SettingsManager {
|
||||
}
|
||||
|
||||
/**
|
||||
* Update a specific setting (saves to both localStorage and backend)
|
||||
* Update a specific setting and save to backend
|
||||
* @param {string} key - Setting key
|
||||
* @param {*} value - New value
|
||||
*/
|
||||
@@ -253,28 +270,23 @@ export class SettingsManager {
|
||||
settings.enabledMapLayers = this._collapseLayerSettings(settings)
|
||||
}
|
||||
|
||||
// Save to localStorage immediately
|
||||
this.saveToLocalStorage(settings)
|
||||
// Update cache immediately
|
||||
this.updateCache(settings)
|
||||
|
||||
// Save to backend (non-blocking)
|
||||
this.saveToBackend(settings).catch(error => {
|
||||
console.warn('[Settings] Backend save failed, but localStorage updated:', error)
|
||||
})
|
||||
// Save to backend
|
||||
await this.saveToBackend(settings)
|
||||
}
|
||||
|
||||
/**
|
||||
* Reset to defaults
|
||||
*/
|
||||
static resetToDefaults() {
|
||||
static async resetToDefaults() {
|
||||
try {
|
||||
localStorage.removeItem(STORAGE_KEY)
|
||||
this.cachedSettings = null // Clear cache
|
||||
|
||||
// Also reset on backend
|
||||
// Reset on backend
|
||||
if (this.apiKey) {
|
||||
this.saveToBackend(DEFAULT_SETTINGS).catch(error => {
|
||||
console.warn('[Settings] Failed to reset backend settings:', error)
|
||||
})
|
||||
await this.saveToBackend(DEFAULT_SETTINGS)
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to reset settings:', error)
|
||||
@@ -282,9 +294,9 @@ export class SettingsManager {
|
||||
}
|
||||
|
||||
/**
|
||||
* Sync settings: load from backend and merge with localStorage
|
||||
* Sync settings: load from backend
|
||||
* Call this on app initialization
|
||||
* @returns {Promise<Object>} Merged settings
|
||||
* @returns {Promise<Object>} Settings from backend
|
||||
*/
|
||||
static async sync() {
|
||||
const backendSettings = await this.loadFromBackend()
|
||||
|
||||
@@ -102,7 +102,7 @@ function haversineDistance(lat1, lon1, lat2, lon2) {
|
||||
*/
|
||||
export function getSpeedColor(speedKmh, useSpeedColors, speedColorScale) {
|
||||
if (!useSpeedColors) {
|
||||
return '#f97316' // Default orange color
|
||||
return '#0000ff' // Default blue color (matching v1)
|
||||
}
|
||||
|
||||
let colorStops
|
||||
|
||||
@@ -4,6 +4,8 @@ class Family::Invitations::CleanupJob < ApplicationJob
|
||||
queue_as :families
|
||||
|
||||
def perform
|
||||
return unless DawarichSettings.family_feature_enabled?
|
||||
|
||||
Rails.logger.info 'Starting family invitations cleanup'
|
||||
|
||||
expired_count = Family::Invitation.where(status: :pending)
|
||||
|
||||
21  app/jobs/points/raw_data/archive_job.rb  Normal file
@@ -0,0 +1,21 @@
|
||||
# frozen_string_literal: true
|
||||
|
||||
module Points
|
||||
module RawData
|
||||
class ArchiveJob < ApplicationJob
|
||||
queue_as :archival
|
||||
|
||||
def perform
|
||||
return unless ENV['ARCHIVE_RAW_DATA'] == 'true'
|
||||
|
||||
stats = Points::RawData::Archiver.new.call
|
||||
|
||||
Rails.logger.info("Archive job complete: #{stats}")
|
||||
rescue StandardError => e
|
||||
ExceptionReporter.call(e, 'Points raw data archival job failed')
|
||||
|
||||
raise
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|
||||
19  app/jobs/points/raw_data/re_archive_month_job.rb  Normal file
@@ -0,0 +1,19 @@
|
||||
# frozen_string_literal: true
|
||||
|
||||
module Points
|
||||
module RawData
|
||||
class ReArchiveMonthJob < ApplicationJob
|
||||
queue_as :archival
|
||||
|
||||
def perform(user_id, year, month)
|
||||
Rails.logger.info("Re-archiving #{user_id}/#{year}/#{month} (retrospective import)")
|
||||
|
||||
Points::RawData::Archiver.new.archive_specific_month(user_id, year, month)
|
||||
rescue StandardError => e
|
||||
ExceptionReporter.call(e, "Re-archival job failed for #{user_id}/#{year}/#{month}")
|
||||
|
||||
raise
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|
||||
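`ReArchiveMonthJob` complements the scheduled archiver: it rebuilds the archive for a single user-month, for example after a retrospective import adds points to an already archived month. A hedged sketch of enqueuing it (the user id, year, and month are placeholders):

```ruby
# Re-archive May 2024 for one user after importing old data into that month.
# The job delegates to Points::RawData::Archiver#archive_specific_month.
Points::RawData::ReArchiveMonthJob.perform_later(user_id, 2024, 5)
```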
83  app/models/concerns/archivable.rb  Normal file
@@ -0,0 +1,83 @@
|
||||
# frozen_string_literal: true
|
||||
|
||||
module Archivable
|
||||
extend ActiveSupport::Concern
|
||||
|
||||
included do
|
||||
belongs_to :raw_data_archive,
|
||||
class_name: 'Points::RawDataArchive',
|
||||
optional: true
|
||||
|
||||
scope :archived, -> { where(raw_data_archived: true) }
|
||||
scope :not_archived, -> { where(raw_data_archived: false) }
|
||||
scope :with_archived_raw_data, lambda {
|
||||
includes(raw_data_archive: { file_attachment: :blob })
|
||||
}
|
||||
end
|
||||
|
||||
# Main method: Get raw_data with fallback to archive
|
||||
# Use this instead of point.raw_data when you need archived data
|
||||
def raw_data_with_archive
|
||||
return raw_data if raw_data.present? || !raw_data_archived?
|
||||
|
||||
fetch_archived_raw_data
|
||||
end
|
||||
|
||||
# Restore archived data back to database column
|
||||
def restore_raw_data!(value)
|
||||
update!(
|
||||
raw_data: value,
|
||||
raw_data_archived: false,
|
||||
raw_data_archive_id: nil
|
||||
)
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
def fetch_archived_raw_data
|
||||
# Check temporary restore cache first (for migrations)
|
||||
cached = check_temporary_restore_cache
|
||||
return cached if cached
|
||||
|
||||
fetch_from_archive_file
|
||||
rescue StandardError => e
|
||||
handle_archive_fetch_error(e)
|
||||
end
|
||||
|
||||
def check_temporary_restore_cache
|
||||
return nil unless respond_to?(:timestamp)
|
||||
|
||||
recorded_time = Time.at(timestamp)
|
||||
cache_key = "raw_data:temp:#{user_id}:#{recorded_time.year}:#{recorded_time.month}:#{id}"
|
||||
Rails.cache.read(cache_key)
|
||||
end
|
||||
|
||||
def fetch_from_archive_file
|
||||
return {} unless raw_data_archive&.file&.attached?
|
||||
|
||||
# Download and search through JSONL
|
||||
compressed_content = raw_data_archive.file.blob.download
|
||||
io = StringIO.new(compressed_content)
|
||||
gz = Zlib::GzipReader.new(io)
|
||||
|
||||
begin
|
||||
result = nil
|
||||
gz.each_line do |line|
|
||||
data = JSON.parse(line)
|
||||
if data['id'] == id
|
||||
result = data['raw_data']
|
||||
break
|
||||
end
|
||||
end
|
||||
result || {}
|
||||
ensure
|
||||
gz.close
|
||||
end
|
||||
end
|
||||
|
||||
def handle_archive_fetch_error(error)
|
||||
ExceptionReporter.call(error, "Failed to fetch archived raw_data for Point ID #{id}")
|
||||
|
||||
{} # Graceful degradation
|
||||
end
|
||||
end
|
||||
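With the concern included in `Point` (see below), archived raw data stays transparently readable. A hedged usage sketch (the point lookup is illustrative):

```ruby
point = Point.archived.first

point.raw_data              # => {} (cleared after successful archiving)
point.raw_data_with_archive # downloads the gzipped JSONL chunk and returns the original hash

# Put the data back into the raw_data column if it is needed again:
point.restore_raw_data!(point.raw_data_with_archive)
```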
@@ -8,18 +8,18 @@ module Taggable
|
||||
has_many :tags, through: :taggings
|
||||
|
||||
scope :with_tags, ->(tag_ids) { joins(:taggings).where(taggings: { tag_id: tag_ids }).distinct }
|
||||
scope :with_all_tags, ->(tag_ids) {
|
||||
tag_ids = Array(tag_ids)
|
||||
scope :with_all_tags, lambda { |tag_ids|
|
||||
tag_ids = Array(tag_ids).uniq
|
||||
return none if tag_ids.empty?
|
||||
|
||||
# For each tag, join and filter, then use HAVING to ensure all tags are present
|
||||
joins(:taggings)
|
||||
.where(taggings: { tag_id: tag_ids })
|
||||
.group("#{table_name}.id")
|
||||
.having("COUNT(DISTINCT taggings.tag_id) = ?", tag_ids.length)
|
||||
.having('COUNT(DISTINCT taggings.tag_id) = ?', tag_ids.length)
|
||||
}
|
||||
scope :without_tags, -> { left_joins(:taggings).where(taggings: { id: nil }) }
|
||||
scope :tagged_with, ->(tag_name, user) {
|
||||
scope :tagged_with, lambda { |tag_name, user|
|
||||
joins(:tags).where(tags: { name: tag_name, user: user }).distinct
|
||||
}
|
||||
end
|
||||
|
||||
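The reworked `with_all_tags` scope requires every given tag to be present (AND semantics via the `HAVING COUNT(DISTINCT ...)` clause), in contrast to `with_tags`. A hedged usage sketch; `TaggedModel` stands in for any model that includes `Taggable`, and the tag ids are placeholders:

```ruby
# AND semantics: only records carrying every one of the given tags.
TaggedModel.with_all_tags([tag_a.id, tag_b.id])

# OR semantics for comparison: records carrying any of the given tags.
TaggedModel.with_tags([tag_a.id, tag_b.id])
```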
@@ -3,6 +3,7 @@
|
||||
class Point < ApplicationRecord
|
||||
include Nearable
|
||||
include Distanceable
|
||||
include Archivable
|
||||
|
||||
belongs_to :import, optional: true, counter_cache: true
|
||||
belongs_to :visit, optional: true
|
||||
|
||||
40  app/models/points/raw_data_archive.rb  Normal file
@@ -0,0 +1,40 @@
|
||||
# frozen_string_literal: true
|
||||
|
||||
module Points
|
||||
class RawDataArchive < ApplicationRecord
|
||||
self.table_name = 'points_raw_data_archives'
|
||||
|
||||
belongs_to :user
|
||||
has_many :points, dependent: :nullify
|
||||
|
||||
has_one_attached :file
|
||||
|
||||
validates :year, :month, :chunk_number, :point_count, presence: true
|
||||
validates :year, numericality: { greater_than: 1970, less_than: 2100 }
|
||||
validates :month, numericality: { greater_than_or_equal_to: 1, less_than_or_equal_to: 12 }
|
||||
validates :chunk_number, numericality: { greater_than: 0 }
|
||||
validates :point_ids_checksum, presence: true
|
||||
|
||||
scope :for_month, lambda { |user_id, year, month|
|
||||
where(user_id: user_id, year: year, month: month)
|
||||
.order(:chunk_number)
|
||||
}
|
||||
|
||||
scope :recent, -> { where('archived_at > ?', 30.days.ago) }
|
||||
scope :old, -> { where('archived_at < ?', 1.year.ago) }
|
||||
|
||||
def month_display
|
||||
Date.new(year, month, 1).strftime('%B %Y')
|
||||
end
|
||||
|
||||
def filename
|
||||
"raw_data_archives/#{user_id}/#{year}/#{format('%02d', month)}/#{format('%03d', chunk_number)}.jsonl.gz"
|
||||
end
|
||||
|
||||
def size_mb
|
||||
return 0 unless file.attached?
|
||||
|
||||
(file.blob.byte_size / 1024.0 / 1024.0).round(2)
|
||||
end
|
||||
end
|
||||
end
|
||||
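A quick sketch of inspecting a user's archives for one month with the `for_month` scope above (the user id and date are placeholders):

```ruby
archives = Points::RawDataArchive.for_month(user_id, 2024, 5)

# Chunks come back ordered by chunk_number.
archives.each do |archive|
  puts "#{archive.month_display} chunk #{archive.chunk_number}: #{archive.size_mb} MB"
end
```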
@@ -20,6 +20,7 @@ class User < ApplicationRecord # rubocop:disable Metrics/ClassLength
|
||||
has_many :tags, dependent: :destroy
|
||||
has_many :trips, dependent: :destroy
|
||||
has_many :tracks, dependent: :destroy
|
||||
has_many :raw_data_archives, class_name: 'Points::RawDataArchive', dependent: :destroy
|
||||
|
||||
after_create :create_api_key
|
||||
after_commit :activate, on: :create, if: -> { DawarichSettings.self_hosted? }
|
||||
|
||||
@@ -2,10 +2,14 @@
|
||||
|
||||
class ExceptionReporter
|
||||
def self.call(exception, human_message = 'Exception reported')
|
||||
return unless DawarichSettings.self_hosted?
|
||||
return if DawarichSettings.self_hosted?
|
||||
|
||||
Rails.logger.error "#{human_message}: #{exception.message}"
|
||||
|
||||
Sentry.capture_exception(exception)
|
||||
if exception.is_a?(Exception)
|
||||
Rails.logger.error "#{human_message}: #{exception.message}"
|
||||
Sentry.capture_exception(exception)
|
||||
else
|
||||
Rails.logger.error "#{exception}: #{human_message}"
|
||||
Sentry.capture_message("#{exception}: #{human_message}")
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
@@ -127,6 +127,15 @@ class Imports::SourceDetector
|
||||
else
|
||||
file_content
|
||||
end
|
||||
|
||||
# Check if it's a KMZ file (ZIP archive)
|
||||
if filename&.downcase&.end_with?('.kmz')
|
||||
# KMZ files are ZIP archives, check for ZIP signature
|
||||
# ZIP files start with "PK" (0x50 0x4B)
|
||||
return content_to_check[0..1] == 'PK'
|
||||
end
|
||||
|
||||
# For KML files, check XML structure
|
||||
(
|
||||
content_to_check.strip.start_with?('<?xml') ||
|
||||
content_to_check.strip.start_with?('<kml')
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
# frozen_string_literal: true
|
||||
|
||||
require 'rexml/document'
|
||||
require 'zip'
|
||||
|
||||
class Kml::Importer
|
||||
include Imports::Broadcaster
|
||||
@@ -15,149 +16,246 @@ class Kml::Importer
|
||||
end
|
||||
|
||||
def call
|
||||
file_content = load_file_content
|
||||
doc = REXML::Document.new(file_content)
|
||||
|
||||
points_data = []
|
||||
|
||||
# Process all Placemarks which can contain various geometry types
|
||||
REXML::XPath.each(doc, '//Placemark') do |placemark|
|
||||
points_data.concat(parse_placemark(placemark))
|
||||
end
|
||||
|
||||
# Process gx:Track elements (Google Earth extensions for GPS tracks)
|
||||
REXML::XPath.each(doc, '//gx:Track') do |track|
|
||||
points_data.concat(parse_gx_track(track))
|
||||
end
|
||||
|
||||
points_data.compact!
|
||||
doc = load_and_parse_kml_document
|
||||
points_data = extract_all_points(doc)
|
||||
|
||||
return if points_data.empty?
|
||||
|
||||
# Process in batches to avoid memory issues with large files
|
||||
save_points_in_batches(points_data)
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
def load_and_parse_kml_document
|
||||
file_content = load_kml_content
|
||||
REXML::Document.new(file_content)
|
||||
end
|
||||
|
||||
def extract_all_points(doc)
|
||||
points_data = []
|
||||
points_data.concat(extract_points_from_placemarks(doc))
|
||||
points_data.concat(extract_points_from_gx_tracks(doc))
|
||||
points_data.compact
|
||||
end
|
||||
|
||||
def save_points_in_batches(points_data)
|
||||
points_data.each_slice(1000) do |batch|
|
||||
bulk_insert_points(batch)
|
||||
end
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
def parse_placemark(placemark)
|
||||
def extract_points_from_placemarks(doc)
|
||||
points = []
|
||||
timestamp = extract_timestamp(placemark)
|
||||
|
||||
# Handle Point geometry
|
||||
point_node = REXML::XPath.first(placemark, './/Point/coordinates')
|
||||
if point_node
|
||||
coords = parse_coordinates(point_node.text)
|
||||
points << build_point(coords.first, timestamp, placemark) if coords.any?
|
||||
REXML::XPath.each(doc, '//Placemark') do |placemark|
|
||||
points.concat(parse_placemark(placemark))
|
||||
end
|
||||
points
|
||||
end
|
||||
|
||||
# Handle LineString geometry (tracks/routes)
|
||||
linestring_node = REXML::XPath.first(placemark, './/LineString/coordinates')
|
||||
if linestring_node
|
||||
coords = parse_coordinates(linestring_node.text)
|
||||
coords.each do |coord|
|
||||
points << build_point(coord, timestamp, placemark)
|
||||
def extract_points_from_gx_tracks(doc)
|
||||
points = []
|
||||
REXML::XPath.each(doc, '//gx:Track') do |track|
|
||||
points.concat(parse_gx_track(track))
|
||||
end
|
||||
points
|
||||
end
|
||||
|
||||
def load_kml_content
|
||||
content = read_file_content
|
||||
content = ensure_binary_encoding(content)
|
||||
kmz_file?(content) ? extract_kml_from_kmz(content) : content
|
||||
end
|
||||
|
||||
def read_file_content
|
||||
if file_path && File.exist?(file_path)
|
||||
File.binread(file_path)
|
||||
else
|
||||
download_and_read_content
|
||||
end
|
||||
end
|
||||
|
||||
def download_and_read_content
|
||||
downloader_content = Imports::SecureFileDownloader.new(import.file).download_with_verification
|
||||
downloader_content.is_a?(StringIO) ? downloader_content.read : downloader_content
|
||||
end
|
||||
|
||||
def ensure_binary_encoding(content)
|
||||
content.force_encoding('BINARY') if content.respond_to?(:force_encoding)
|
||||
content
|
||||
end
|
||||
|
||||
def kmz_file?(content)
|
||||
content[0..1] == 'PK'
|
||||
end
|
||||
|
||||
def extract_kml_from_kmz(kmz_content)
|
||||
kml_content = find_kml_in_zip(kmz_content)
|
||||
raise 'No KML file found in KMZ archive' unless kml_content
|
||||
|
||||
kml_content
|
||||
rescue Zip::Error => e
|
||||
raise "Failed to extract KML from KMZ: #{e.message}"
|
||||
end
|
||||
|
||||
def find_kml_in_zip(kmz_content)
|
||||
kml_content = nil
|
||||
|
||||
Zip::InputStream.open(StringIO.new(kmz_content)) do |io|
|
||||
while (entry = io.get_next_entry)
|
||||
if kml_entry?(entry)
|
||||
kml_content = io.read
|
||||
break
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
# Handle MultiGeometry (can contain multiple Points, LineStrings, etc.)
|
||||
kml_content
|
||||
end
|
||||
|
||||
def kml_entry?(entry)
|
||||
entry.name.downcase.end_with?('.kml')
|
||||
end
|
||||
|
||||
def parse_placemark(placemark)
|
||||
return [] unless has_explicit_timestamp?(placemark)
|
||||
|
||||
timestamp = extract_timestamp(placemark)
|
||||
points = []
|
||||
|
||||
points.concat(extract_point_geometry(placemark, timestamp))
|
||||
points.concat(extract_linestring_geometry(placemark, timestamp))
|
||||
points.concat(extract_multigeometry(placemark, timestamp))
|
||||
|
||||
points.compact
|
||||
end
|
||||
|
||||
def extract_point_geometry(placemark, timestamp)
|
||||
point_node = REXML::XPath.first(placemark, './/Point/coordinates')
|
||||
return [] unless point_node
|
||||
|
||||
coords = parse_coordinates(point_node.text)
|
||||
coords.any? ? [build_point(coords.first, timestamp, placemark)] : []
|
||||
end
|
||||
|
||||
def extract_linestring_geometry(placemark, timestamp)
|
||||
linestring_node = REXML::XPath.first(placemark, './/LineString/coordinates')
|
||||
return [] unless linestring_node
|
||||
|
||||
coords = parse_coordinates(linestring_node.text)
|
||||
coords.map { |coord| build_point(coord, timestamp, placemark) }
|
||||
end
|
||||
|
||||
def extract_multigeometry(placemark, timestamp)
|
||||
points = []
|
||||
REXML::XPath.each(placemark, './/MultiGeometry//coordinates') do |coords_node|
|
||||
coords = parse_coordinates(coords_node.text)
|
||||
coords.each do |coord|
|
||||
points << build_point(coord, timestamp, placemark)
|
||||
end
|
||||
end
|
||||
|
||||
points.compact
|
||||
points
|
||||
end
|
||||
|
||||
def parse_gx_track(track)
|
||||
# Google Earth Track extension with coordinated when/coord pairs
|
||||
points = []
|
||||
timestamps = extract_gx_timestamps(track)
|
||||
coordinates = extract_gx_coordinates(track)
|
||||
|
||||
build_gx_track_points(timestamps, coordinates)
|
||||
end
|
||||
|
||||
def extract_gx_timestamps(track)
|
||||
timestamps = []
|
||||
REXML::XPath.each(track, './/when') do |when_node|
|
||||
timestamps << when_node.text.strip
|
||||
end
|
||||
timestamps
|
||||
end
|
||||
|
||||
def extract_gx_coordinates(track)
|
||||
coordinates = []
|
||||
REXML::XPath.each(track, './/gx:coord') do |coord_node|
|
||||
coordinates << coord_node.text.strip
|
||||
end
|
||||
coordinates
|
||||
end
|
||||
|
||||
# Match timestamps with coordinates
|
||||
[timestamps.size, coordinates.size].min.times do |i|
|
||||
begin
|
||||
time = Time.parse(timestamps[i]).to_i
|
||||
coord_parts = coordinates[i].split(/\s+/)
|
||||
next if coord_parts.size < 2
|
||||
def build_gx_track_points(timestamps, coordinates)
|
||||
points = []
|
||||
min_size = [timestamps.size, coordinates.size].min
|
||||
|
||||
lng, lat, alt = coord_parts.map(&:to_f)
|
||||
|
||||
points << {
|
||||
lonlat: "POINT(#{lng} #{lat})",
|
||||
altitude: alt&.to_i || 0,
|
||||
timestamp: time,
|
||||
import_id: import.id,
|
||||
velocity: 0.0,
|
||||
raw_data: { source: 'gx_track', index: i },
|
||||
user_id: user_id,
|
||||
created_at: Time.current,
|
||||
updated_at: Time.current
|
||||
}
|
||||
rescue StandardError => e
|
||||
Rails.logger.warn("Failed to parse gx:Track point at index #{i}: #{e.message}")
|
||||
next
|
||||
end
|
||||
min_size.times do |i|
|
||||
point = build_gx_track_point(timestamps[i], coordinates[i], i)
|
||||
points << point if point
|
||||
end
|
||||
|
||||
points
|
||||
end
|
||||
|
||||
def build_gx_track_point(timestamp_str, coord_str, index)
|
||||
time = Time.parse(timestamp_str).to_i
|
||||
coord_parts = coord_str.split(/\s+/)
|
||||
return nil if coord_parts.size < 2
|
||||
|
||||
lng, lat, alt = coord_parts.map(&:to_f)
|
||||
|
||||
{
|
||||
lonlat: "POINT(#{lng} #{lat})",
|
||||
altitude: alt&.to_i || 0,
|
||||
timestamp: time,
|
||||
import_id: import.id,
|
||||
velocity: 0.0,
|
||||
raw_data: { source: 'gx_track', index: index },
|
||||
user_id: user_id,
|
||||
created_at: Time.current,
|
||||
updated_at: Time.current
|
||||
}
|
||||
rescue StandardError => e
|
||||
Rails.logger.warn("Failed to parse gx:Track point at index #{index}: #{e.message}")
|
||||
nil
|
||||
end
|
||||
|
||||
def parse_coordinates(coord_text)
|
||||
# KML coordinates format: "longitude,latitude[,altitude] ..."
|
||||
# Multiple coordinates separated by whitespace
|
||||
return [] if coord_text.blank?
|
||||
|
||||
coord_text.strip.split(/\s+/).map do |coord_str|
|
||||
parts = coord_str.split(',')
|
||||
next if parts.size < 2
|
||||
coord_text.strip.split(/\s+/).map { |coord_str| parse_single_coordinate(coord_str) }.compact
|
||||
end
|
||||
|
||||
{
|
||||
lng: parts[0].to_f,
|
||||
lat: parts[1].to_f,
|
||||
alt: parts[2]&.to_f || 0.0
|
||||
}
|
||||
end.compact
|
||||
def parse_single_coordinate(coord_str)
|
||||
parts = coord_str.split(',')
|
||||
return nil if parts.size < 2
|
||||
|
||||
{
|
||||
lng: parts[0].to_f,
|
||||
lat: parts[1].to_f,
|
||||
alt: parts[2]&.to_f || 0.0
|
||||
}
|
||||
end
|
||||
|
||||
def has_explicit_timestamp?(placemark)
|
||||
find_timestamp_node(placemark).present?
|
||||
end
|
||||
|
||||
def extract_timestamp(placemark)
|
||||
# Try TimeStamp first
|
||||
timestamp_node = REXML::XPath.first(placemark, './/TimeStamp/when')
|
||||
return Time.parse(timestamp_node.text).to_i if timestamp_node
|
||||
node = find_timestamp_node(placemark)
|
||||
raise 'No timestamp found in placemark' unless node
|
||||
|
||||
# Try TimeSpan begin
|
||||
timespan_begin = REXML::XPath.first(placemark, './/TimeSpan/begin')
|
||||
return Time.parse(timespan_begin.text).to_i if timespan_begin
|
||||
|
||||
# Try TimeSpan end as fallback
|
||||
timespan_end = REXML::XPath.first(placemark, './/TimeSpan/end')
|
||||
return Time.parse(timespan_end.text).to_i if timespan_end
|
||||
|
||||
# Default to import creation time if no timestamp found
|
||||
import.created_at.to_i
|
||||
Time.parse(node.text).to_i
|
||||
rescue StandardError => e
|
||||
Rails.logger.warn("Failed to parse timestamp: #{e.message}")
|
||||
import.created_at.to_i
|
||||
Rails.logger.error("Failed to parse timestamp: #{e.message}")
|
||||
raise e
|
||||
end
|
||||
|
||||
def find_timestamp_node(placemark)
|
||||
REXML::XPath.first(placemark, './/TimeStamp/when') ||
|
||||
REXML::XPath.first(placemark, './/TimeSpan/begin') ||
|
||||
REXML::XPath.first(placemark, './/TimeSpan/end')
|
||||
end
|
||||
|
||||
def build_point(coord, timestamp, placemark)
|
||||
return if coord[:lat].blank? || coord[:lng].blank?
|
||||
return if invalid_coordinates?(coord)
|
||||
|
||||
{
|
||||
lonlat: "POINT(#{coord[:lng]} #{coord[:lat]})",
|
||||
lonlat: format_point_geometry(coord),
|
||||
altitude: coord[:alt].to_i,
|
||||
timestamp: timestamp,
|
||||
import_id: import.id,
|
||||
@@ -169,31 +267,52 @@ class Kml::Importer
|
||||
}
|
||||
end
|
||||
|
||||
def invalid_coordinates?(coord)
|
||||
coord[:lat].blank? || coord[:lng].blank?
|
||||
end
|
||||
|
||||
def format_point_geometry(coord)
|
||||
"POINT(#{coord[:lng]} #{coord[:lat]})"
|
||||
end
|
||||
|
||||
def extract_velocity(placemark)
|
||||
# Try to extract speed from ExtendedData
|
||||
speed_node = REXML::XPath.first(placemark, ".//Data[@name='speed']/value") ||
|
||||
REXML::XPath.first(placemark, ".//Data[@name='Speed']/value") ||
|
||||
REXML::XPath.first(placemark, ".//Data[@name='velocity']/value")
|
||||
|
||||
return speed_node.text.to_f.round(1) if speed_node
|
||||
|
||||
0.0
|
||||
speed_node = find_speed_node(placemark)
|
||||
speed_node ? speed_node.text.to_f.round(1) : 0.0
|
||||
rescue StandardError
|
||||
0.0
|
||||
end
|
||||
|
||||
def find_speed_node(placemark)
|
||||
REXML::XPath.first(placemark, ".//Data[@name='speed']/value") ||
|
||||
REXML::XPath.first(placemark, ".//Data[@name='Speed']/value") ||
|
||||
REXML::XPath.first(placemark, ".//Data[@name='velocity']/value")
|
||||
end
|
||||
|
||||
def extract_extended_data(placemark)
|
||||
data = {}
|
||||
data.merge!(extract_name_and_description(placemark))
|
||||
data.merge!(extract_custom_data_fields(placemark))
|
||||
data
|
||||
rescue StandardError => e
|
||||
Rails.logger.warn("Failed to extract extended data: #{e.message}")
|
||||
{}
|
||||
end
|
||||
|
||||
def extract_name_and_description(placemark)
|
||||
data = {}
|
||||
|
||||
# Extract name if present
|
||||
name_node = REXML::XPath.first(placemark, './/name')
|
||||
data['name'] = name_node.text.strip if name_node
|
||||
|
||||
# Extract description if present
|
||||
desc_node = REXML::XPath.first(placemark, './/description')
|
||||
data['description'] = desc_node.text.strip if desc_node
|
||||
|
||||
# Extract all ExtendedData/Data elements
|
||||
data
|
||||
end
|
||||
|
||||
def extract_custom_data_fields(placemark)
|
||||
data = {}
|
||||
|
||||
REXML::XPath.each(placemark, './/ExtendedData/Data') do |data_node|
|
||||
name = data_node.attributes['name']
|
||||
value_node = REXML::XPath.first(data_node, './value')
|
||||
@@ -201,26 +320,29 @@ class Kml::Importer
|
||||
end
|
||||
|
||||
data
|
||||
rescue StandardError => e
|
||||
Rails.logger.warn("Failed to extract extended data: #{e.message}")
|
||||
{}
|
||||
end
|
||||
|
||||
def bulk_insert_points(batch)
|
||||
unique_batch = batch.uniq { |record| [record[:lonlat], record[:timestamp], record[:user_id]] }
|
||||
unique_batch = deduplicate_batch(batch)
|
||||
upsert_points(unique_batch)
|
||||
broadcast_import_progress(import, unique_batch.size)
|
||||
rescue StandardError => e
|
||||
create_notification("Failed to process KML file: #{e.message}")
|
||||
end
|
||||
|
||||
def deduplicate_batch(batch)
|
||||
batch.uniq { |record| [record[:lonlat], record[:timestamp], record[:user_id]] }
|
||||
end
|
||||
|
||||
def upsert_points(batch)
|
||||
# rubocop:disable Rails/SkipsModelValidations
|
||||
Point.upsert_all(
|
||||
unique_batch,
|
||||
batch,
|
||||
unique_by: %i[lonlat timestamp user_id],
|
||||
returning: false,
|
||||
on_duplicate: :skip
|
||||
)
|
||||
# rubocop:enable Rails/SkipsModelValidations
|
||||
|
||||
broadcast_import_progress(import, unique_batch.size)
|
||||
rescue StandardError => e
|
||||
create_notification("Failed to process KML file: #{e.message}")
|
||||
end
|
||||
|
||||
def create_notification(message)
|
||||
|
||||
app/services/points/raw_data/archiver.rb (new file, 184 lines)
@@ -0,0 +1,184 @@
# frozen_string_literal: true

module Points
  module RawData
    class Archiver
      SAFE_ARCHIVE_LAG = 2.months

      def initialize
        @stats = { processed: 0, archived: 0, failed: 0 }
      end

      def call
        unless archival_enabled?
          Rails.logger.info('Raw data archival disabled (ARCHIVE_RAW_DATA != "true")')
          return @stats
        end

        Rails.logger.info('Starting points raw_data archival...')

        archivable_months.each do |month_data|
          process_month(month_data)
        end

        Rails.logger.info("Archival complete: #{@stats}")
        @stats
      end

      def archive_specific_month(user_id, year, month)
        month_data = {
          'user_id' => user_id,
          'year' => year,
          'month' => month
        }

        process_month(month_data)
      end

      private

      def archival_enabled?
        ENV['ARCHIVE_RAW_DATA'] == 'true'
      end

      def archivable_months
        # Only months 2+ months old with unarchived points
        safe_cutoff = Date.current.beginning_of_month - SAFE_ARCHIVE_LAG

        # Use raw SQL to avoid GROUP BY issues with ActiveRecord
        # Use AT TIME ZONE 'UTC' to ensure consistent timezone handling
        sql = <<-SQL.squish
          SELECT user_id,
                 EXTRACT(YEAR FROM (to_timestamp(timestamp) AT TIME ZONE 'UTC'))::int as year,
                 EXTRACT(MONTH FROM (to_timestamp(timestamp) AT TIME ZONE 'UTC'))::int as month,
                 COUNT(*) as unarchived_count
          FROM points
          WHERE raw_data_archived = false
            AND raw_data IS NOT NULL
            AND raw_data != '{}'
            AND to_timestamp(timestamp) < ?
          GROUP BY user_id,
                   EXTRACT(YEAR FROM (to_timestamp(timestamp) AT TIME ZONE 'UTC')),
                   EXTRACT(MONTH FROM (to_timestamp(timestamp) AT TIME ZONE 'UTC'))
        SQL

        ActiveRecord::Base.connection.exec_query(
          ActiveRecord::Base.sanitize_sql_array([sql, safe_cutoff])
        )
      end

      def process_month(month_data)
        user_id = month_data['user_id']
        year = month_data['year']
        month = month_data['month']

        lock_key = "archive_points:#{user_id}:#{year}:#{month}"

        # Advisory lock prevents duplicate processing
        # Returns false if lock couldn't be acquired (already locked)
        lock_acquired = ActiveRecord::Base.with_advisory_lock(lock_key, timeout_seconds: 0) do
          archive_month(user_id, year, month)
          @stats[:processed] += 1
          true
        end

        Rails.logger.info("Skipping #{lock_key} - already locked") unless lock_acquired
      rescue StandardError => e
        ExceptionReporter.call(e, "Failed to archive points for user #{user_id}, #{year}-#{month}")

        @stats[:failed] += 1
      end

      def archive_month(user_id, year, month)
        points = find_archivable_points(user_id, year, month)
        return if points.empty?

        point_ids = points.pluck(:id)
        log_archival_start(user_id, year, month, point_ids.count)

        archive = create_archive_chunk(user_id, year, month, points, point_ids)
        mark_points_as_archived(point_ids, archive.id)
        update_stats(point_ids.count)
        log_archival_success(archive)
      end

      def find_archivable_points(user_id, year, month)
        timestamp_range = month_timestamp_range(year, month)

        Point.where(user_id: user_id, raw_data_archived: false)
             .where(timestamp: timestamp_range)
             .where.not(raw_data: nil)
             .where.not(raw_data: '{}')
      end

      def month_timestamp_range(year, month)
        start_of_month = Time.utc(year, month, 1).to_i
        end_of_month = (Time.utc(year, month, 1) + 1.month).to_i
        start_of_month...end_of_month
      end

      def mark_points_as_archived(point_ids, archive_id)
        Point.transaction do
          Point.where(id: point_ids).update_all(
            raw_data_archived: true,
            raw_data_archive_id: archive_id
          )
        end
      end

      def update_stats(archived_count)
        @stats[:archived] += archived_count
      end

      def log_archival_start(user_id, year, month, count)
        Rails.logger.info("Archiving #{count} points for user #{user_id}, #{year}-#{format('%02d', month)}")
      end

      def log_archival_success(archive)
        Rails.logger.info("✓ Archived chunk #{archive.chunk_number} (#{archive.size_mb} MB)")
      end

      def create_archive_chunk(user_id, year, month, points, point_ids)
        # Determine chunk number (append-only)
        chunk_number = Points::RawDataArchive
                       .where(user_id: user_id, year: year, month: month)
                       .maximum(:chunk_number).to_i + 1

        # Compress points data
        compressed_data = Points::RawData::ChunkCompressor.new(points).compress

        # Create archive record
        archive = Points::RawDataArchive.create!(
          user_id: user_id,
          year: year,
          month: month,
          chunk_number: chunk_number,
          point_count: point_ids.count,
          point_ids_checksum: calculate_checksum(point_ids),
          archived_at: Time.current,
          metadata: {
            format_version: 1,
            compression: 'gzip',
            archived_by: 'Points::RawData::Archiver'
          }
        )

        # Attach compressed file via ActiveStorage
        # Uses directory structure: raw_data_archives/:user_id/:year/:month/:chunk.jsonl.gz
        # The key parameter controls the actual storage path
        archive.file.attach(
          io: StringIO.new(compressed_data),
          filename: "#{format('%03d', chunk_number)}.jsonl.gz",
          content_type: 'application/gzip',
          key: "raw_data_archives/#{user_id}/#{year}/#{format('%02d', month)}/#{format('%03d', chunk_number)}.jsonl.gz"
        )

        archive
      end

      def calculate_checksum(point_ids)
        Digest::SHA256.hexdigest(point_ids.sort.join(','))
      end
    end
  end
end
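For reference, a minimal usage sketch of the service above. The constants and methods are the ones defined in this file; the invocation context (console or scheduled job) is an assumption, and the user/month values are placeholders:

```ruby
# Hypothetical invocation, e.g. from a Rails console or a scheduled job.
# Requires ARCHIVE_RAW_DATA=true; archives every month older than
# SAFE_ARCHIVE_LAG that still has unarchived raw_data.
stats = Points::RawData::Archiver.new.call
# => { processed: ..., archived: ..., failed: ... }

# Re-run a single month for one user (skips the archivable_months scan):
Points::RawData::Archiver.new.archive_specific_month(user.id, 2025, 9)
```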
app/services/points/raw_data/chunk_compressor.rb (new file, 25 lines)
@@ -0,0 +1,25 @@
# frozen_string_literal: true

module Points
  module RawData
    class ChunkCompressor
      def initialize(points_relation)
        @points = points_relation
      end

      def compress
        io = StringIO.new
        gz = Zlib::GzipWriter.new(io)

        # Stream points to avoid memory issues with large months
        @points.select(:id, :raw_data).find_each(batch_size: 1000) do |point|
          # Write as JSONL (one JSON object per line)
          gz.puts({ id: point.id, raw_data: point.raw_data }.to_json)
        end

        gz.close
        io.string.force_encoding(Encoding::ASCII_8BIT) # Returns compressed bytes in binary encoding
      end
    end
  end
end
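The compressor writes one JSON object per line into a gzip stream, so a chunk can be read back with plain Zlib and JSON. A small sketch of the round trip, mirroring what Restorer#download_and_decompress and Verifier#decompress_and_extract_data do below (the local file path is hypothetical):

```ruby
require 'zlib'
require 'json'

# Rebuild { point_id => raw_data } from a downloaded chunk (path is hypothetical).
data = {}
Zlib::GzipReader.open('001.jsonl.gz') do |gz|
  gz.each_line do |line|
    row = JSON.parse(line)
    data[row['id']] = row['raw_data']
  end
end
```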
app/services/points/raw_data/clearer.rb (new file, 96 lines)
@@ -0,0 +1,96 @@
# frozen_string_literal: true

module Points
  module RawData
    class Clearer
      BATCH_SIZE = 10_000

      def initialize
        @stats = { cleared: 0, skipped: 0 }
      end

      def call
        Rails.logger.info('Starting raw_data clearing for verified archives...')

        verified_archives.find_each do |archive|
          clear_archive_points(archive)
        end

        Rails.logger.info("Clearing complete: #{@stats}")
        @stats
      end

      def clear_specific_archive(archive_id)
        archive = Points::RawDataArchive.find(archive_id)

        unless archive.verified_at.present?
          Rails.logger.warn("Archive #{archive_id} not verified, skipping clear")
          return { cleared: 0, skipped: 0 }
        end

        clear_archive_points(archive)
      end

      def clear_month(user_id, year, month)
        archives = Points::RawDataArchive.for_month(user_id, year, month)
                                         .where.not(verified_at: nil)

        Rails.logger.info("Clearing #{archives.count} verified archives for #{year}-#{format('%02d', month)}...")

        archives.each { |archive| clear_archive_points(archive) }
      end

      private

      def verified_archives
        # Only archives that are verified but have points with non-empty raw_data
        Points::RawDataArchive
          .where.not(verified_at: nil)
          .where(id: points_needing_clearing.select(:raw_data_archive_id).distinct)
      end

      def points_needing_clearing
        Point.where(raw_data_archived: true)
             .where.not(raw_data: {})
             .where.not(raw_data_archive_id: nil)
      end

      def clear_archive_points(archive)
        Rails.logger.info(
          "Clearing points for archive #{archive.id} " \
          "(#{archive.month_display}, chunk #{archive.chunk_number})..."
        )

        point_ids = Point.where(raw_data_archive_id: archive.id)
                         .where(raw_data_archived: true)
                         .where.not(raw_data: {})
                         .pluck(:id)

        if point_ids.empty?
          Rails.logger.info("No points to clear for archive #{archive.id}")
          return
        end

        cleared_count = clear_points_in_batches(point_ids)
        @stats[:cleared] += cleared_count
        Rails.logger.info("✓ Cleared #{cleared_count} points for archive #{archive.id}")
      rescue StandardError => e
        ExceptionReporter.call(e, "Failed to clear points for archive #{archive.id}")
        Rails.logger.error("✗ Failed to clear archive #{archive.id}: #{e.message}")
      end

      def clear_points_in_batches(point_ids)
        total_cleared = 0

        point_ids.each_slice(BATCH_SIZE) do |batch|
          Point.transaction do
            Point.where(id: batch).update_all(raw_data: {})
            total_cleared += batch.size
          end
        end

        total_cleared
      end
    end
  end
end
app/services/points/raw_data/restorer.rb (new file, 105 lines)
@@ -0,0 +1,105 @@
# frozen_string_literal: true

module Points
  module RawData
    class Restorer
      def restore_to_database(user_id, year, month)
        archives = Points::RawDataArchive.for_month(user_id, year, month)

        raise "No archives found for user #{user_id}, #{year}-#{month}" if archives.empty?

        Rails.logger.info("Restoring #{archives.count} archives to database...")

        Point.transaction do
          archives.each { restore_archive_to_db(_1) }
        end

        Rails.logger.info("✓ Restored #{archives.sum(:point_count)} points")
      end

      def restore_to_memory(user_id, year, month)
        archives = Points::RawDataArchive.for_month(user_id, year, month)

        raise "No archives found for user #{user_id}, #{year}-#{month}" if archives.empty?

        Rails.logger.info("Loading #{archives.count} archives into cache...")

        cache_key_prefix = "raw_data:temp:#{user_id}:#{year}:#{month}"
        count = 0

        archives.each do |archive|
          count += restore_archive_to_cache(archive, cache_key_prefix)
        end

        Rails.logger.info("✓ Loaded #{count} points into cache (expires in 1 hour)")
      end

      def restore_all_for_user(user_id)
        archives =
          Points::RawDataArchive.where(user_id: user_id)
                                .select(:year, :month)
                                .distinct
                                .order(:year, :month)

        Rails.logger.info("Restoring #{archives.count} months for user #{user_id}...")

        archives.each do |archive|
          restore_to_database(user_id, archive.year, archive.month)
        end

        Rails.logger.info('✓ Complete user restore finished')
      end

      private

      def restore_archive_to_db(archive)
        decompressed = download_and_decompress(archive)

        decompressed.each_line do |line|
          data = JSON.parse(line)

          Point.where(id: data['id']).update_all(
            raw_data: data['raw_data'],
            raw_data_archived: false,
            raw_data_archive_id: nil
          )
        end
      end

      def restore_archive_to_cache(archive, cache_key_prefix)
        decompressed = download_and_decompress(archive)
        count = 0

        decompressed.each_line do |line|
          data = JSON.parse(line)

          Rails.cache.write(
            "#{cache_key_prefix}:#{data['id']}",
            data['raw_data'],
            expires_in: 1.hour
          )

          count += 1
        end

        count
      end

      def download_and_decompress(archive)
        # Download via ActiveStorage
        compressed_content = archive.file.blob.download

        # Decompress
        io = StringIO.new(compressed_content)
        gz = Zlib::GzipReader.new(io)
        content = gz.read
        gz.close

        content
      rescue StandardError => e
        Rails.logger.error("Failed to download/decompress archive #{archive.id}: #{e.message}")
        raise
      end
    end
  end
end
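After restore_to_memory, each point's raw_data sits in the Rails cache under the key prefix built above. A hedged sketch of reading one entry back (the ids are placeholders):

```ruby
# Keys follow "raw_data:temp:<user_id>:<year>:<month>:<point_id>" and expire after 1 hour.
user_id, year, month, point_id = 1, 2025, 9, 123_456 # placeholder values
raw = Rails.cache.read("raw_data:temp:#{user_id}:#{year}:#{month}:#{point_id}")
```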
app/services/points/raw_data/verifier.rb (new file, 194 lines)
@@ -0,0 +1,194 @@
# frozen_string_literal: true

module Points
  module RawData
    class Verifier
      def initialize
        @stats = { verified: 0, failed: 0 }
      end

      def call
        Rails.logger.info('Starting raw_data archive verification...')

        unverified_archives.find_each do |archive|
          verify_archive(archive)
        end

        Rails.logger.info("Verification complete: #{@stats}")
        @stats
      end

      def verify_specific_archive(archive_id)
        archive = Points::RawDataArchive.find(archive_id)
        verify_archive(archive)
      end

      def verify_month(user_id, year, month)
        archives = Points::RawDataArchive.for_month(user_id, year, month)
                                         .where(verified_at: nil)

        Rails.logger.info("Verifying #{archives.count} archives for #{year}-#{format('%02d', month)}...")

        archives.each { |archive| verify_archive(archive) }
      end

      private

      def unverified_archives
        Points::RawDataArchive.where(verified_at: nil)
      end

      def verify_archive(archive)
        Rails.logger.info("Verifying archive #{archive.id} (#{archive.month_display}, chunk #{archive.chunk_number})...")

        verification_result = perform_verification(archive)

        if verification_result[:success]
          archive.update!(verified_at: Time.current)
          @stats[:verified] += 1
          Rails.logger.info("✓ Archive #{archive.id} verified successfully")
        else
          @stats[:failed] += 1
          Rails.logger.error("✗ Archive #{archive.id} verification failed: #{verification_result[:error]}")
          ExceptionReporter.call(
            StandardError.new(verification_result[:error]),
            "Archive verification failed for archive #{archive.id}"
          )
        end
      rescue StandardError => e
        @stats[:failed] += 1
        ExceptionReporter.call(e, "Failed to verify archive #{archive.id}")
        Rails.logger.error("✗ Archive #{archive.id} verification error: #{e.message}")
      end

      def perform_verification(archive)
        # 1. Verify file exists and is attached
        unless archive.file.attached?
          return { success: false, error: 'File not attached' }
        end

        # 2. Verify file can be downloaded
        begin
          compressed_content = archive.file.blob.download
        rescue StandardError => e
          return { success: false, error: "File download failed: #{e.message}" }
        end

        # 3. Verify file size is reasonable
        if compressed_content.bytesize.zero?
          return { success: false, error: 'File is empty' }
        end

        # 4. Verify MD5 checksum (if blob has checksum)
        if archive.file.blob.checksum.present?
          calculated_checksum = Digest::MD5.base64digest(compressed_content)
          if calculated_checksum != archive.file.blob.checksum
            return { success: false, error: 'MD5 checksum mismatch' }
          end
        end

        # 5. Verify file can be decompressed and is valid JSONL, extract data
        begin
          archived_data = decompress_and_extract_data(compressed_content)
        rescue StandardError => e
          return { success: false, error: "Decompression/parsing failed: #{e.message}" }
        end

        point_ids = archived_data.keys

        # 6. Verify point count matches
        if point_ids.count != archive.point_count
          return {
            success: false,
            error: "Point count mismatch: expected #{archive.point_count}, found #{point_ids.count}"
          }
        end

        # 7. Verify point IDs checksum matches
        calculated_checksum = calculate_checksum(point_ids)
        if calculated_checksum != archive.point_ids_checksum
          return { success: false, error: 'Point IDs checksum mismatch' }
        end

        # 8. Verify all points still exist in database
        existing_count = Point.where(id: point_ids).count
        if existing_count != point_ids.count
          return {
            success: false,
            error: "Missing points in database: expected #{point_ids.count}, found #{existing_count}"
          }
        end

        # 9. Verify archived raw_data matches current database raw_data
        verification_result = verify_raw_data_matches(archived_data)
        return verification_result unless verification_result[:success]

        { success: true }
      end

      def decompress_and_extract_data(compressed_content)
        io = StringIO.new(compressed_content)
        gz = Zlib::GzipReader.new(io)
        archived_data = {}

        gz.each_line do |line|
          data = JSON.parse(line)
          archived_data[data['id']] = data['raw_data']
        end

        gz.close
        archived_data
      end

      def verify_raw_data_matches(archived_data)
        # For small archives, verify all points. For large archives, sample up to 100 points.
        # Always verify all if 100 or fewer points for maximum accuracy
        if archived_data.size <= 100
          point_ids_to_check = archived_data.keys
        else
          point_ids_to_check = archived_data.keys.sample(100)
        end

        mismatches = []
        found_points = 0

        Point.where(id: point_ids_to_check).find_each do |point|
          found_points += 1
          archived_raw_data = archived_data[point.id]
          current_raw_data = point.raw_data

          # Compare the raw_data (both should be hashes)
          if archived_raw_data != current_raw_data
            mismatches << {
              point_id: point.id,
              archived: archived_raw_data,
              current: current_raw_data
            }
          end
        end

        # Check if we found all the points we were looking for
        if found_points != point_ids_to_check.size
          return {
            success: false,
            error: "Missing points during data verification: expected #{point_ids_to_check.size}, found #{found_points}"
          }
        end

        if mismatches.any?
          return {
            success: false,
            error: "Raw data mismatch detected in #{mismatches.count} point(s). " \
                   "First mismatch: Point #{mismatches.first[:point_id]}"
          }
        end

        { success: true }
      end

      def calculate_checksum(point_ids)
        Digest::SHA256.hexdigest(point_ids.sort.join(','))
      end
    end
  end
end
@@ -66,8 +66,7 @@ class Stats::CalculateMonth
|
||||
.points
|
||||
.without_raw_data
|
||||
.where(timestamp: start_timestamp..end_timestamp)
|
||||
.select(:city, :country_name)
|
||||
.distinct
|
||||
.select(:city, :country_name, :timestamp)
|
||||
|
||||
CountriesAndCities.new(toponym_points).call
|
||||
end
|
||||
|
||||
@@ -25,6 +25,7 @@ require 'oj'
|
||||
class Users::ImportData
|
||||
STREAM_BATCH_SIZE = 5000
|
||||
STREAMED_SECTIONS = %w[places visits points].freeze
|
||||
MAX_ENTRY_SIZE = 10.gigabytes # Maximum size for a single file in the archive
|
||||
|
||||
def initialize(user, archive_path)
|
||||
@user = user
|
||||
@@ -86,12 +87,47 @@ class Users::ImportData
|
||||
|
||||
Rails.logger.debug "Extracting #{entry.name} to #{extraction_path}"
|
||||
|
||||
# Validate entry size before extraction
|
||||
if entry.size > MAX_ENTRY_SIZE
|
||||
Rails.logger.error "Skipping oversized entry: #{entry.name} (#{entry.size} bytes exceeds #{MAX_ENTRY_SIZE} bytes)"
|
||||
raise "Archive entry #{entry.name} exceeds maximum allowed size"
|
||||
end
|
||||
|
||||
FileUtils.mkdir_p(File.dirname(extraction_path))
|
||||
entry.extract(sanitized_name, destination_directory: @import_directory)
|
||||
|
||||
# Extract with proper error handling and cleanup
|
||||
extract_entry_safely(entry, extraction_path)
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
def extract_entry_safely(entry, extraction_path)
|
||||
# Extract with error handling and cleanup on failure
|
||||
begin
|
||||
entry.get_input_stream do |input|
|
||||
File.open(extraction_path, 'wb') do |output|
|
||||
bytes_copied = IO.copy_stream(input, output)
|
||||
|
||||
# Verify extracted size matches expected size
|
||||
if bytes_copied != entry.size
|
||||
raise "Size mismatch for #{entry.name}: expected #{entry.size} bytes, got #{bytes_copied} bytes"
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
Rails.logger.debug "Successfully extracted #{entry.name} (#{entry.size} bytes)"
|
||||
rescue StandardError => e
|
||||
# Clean up partial file on error
|
||||
FileUtils.rm_f(extraction_path) if File.exist?(extraction_path)
|
||||
|
||||
Rails.logger.error "Failed to extract #{entry.name}: #{e.message}"
|
||||
Rails.logger.error e.backtrace.join("\n")
|
||||
|
||||
# Re-raise to stop the import process
|
||||
raise "Extraction failed for #{entry.name}: #{e.message}"
|
||||
end
|
||||
end
|
||||
|
||||
def sanitize_zip_entry_name(entry_name)
|
||||
sanitized = entry_name.gsub(%r{^[/\\]+}, '')
|
||||
|
||||
|
||||
@@ -17,12 +17,13 @@
|
||||
data-tracks='<%= @tracks.to_json.html_safe %>'
|
||||
data-distance="<%= @distance %>"
|
||||
data-points_number="<%= @points_number %>"
|
||||
data-timezone="<%= Rails.configuration.time_zone %>"
|
||||
data-timezone="<%= current_user.timezone %>"
|
||||
data-features='<%= @features.to_json.html_safe %>'
|
||||
data-user_tags='<%= current_user.tags.ordered.select(:id, :name, :icon, :color).as_json.to_json.html_safe %>'
|
||||
data-home_coordinates='<%= @home_coordinates.to_json.html_safe %>'
|
||||
data-family-members-features-value='<%= @features.to_json.html_safe %>'
|
||||
data-family-members-user-theme-value="<%= current_user&.theme || 'dark' %>">
|
||||
data-family-members-user-theme-value="<%= current_user&.theme || 'dark' %>"
|
||||
data-family-members-timezone-value="<%= current_user.timezone %>">
|
||||
<div data-maps-target="container" class="w-full h-full">
|
||||
<div id="fog" class="fog"></div>
|
||||
</div>
|
||||
|
||||
@@ -5,8 +5,9 @@
|
||||
<div id="maps-maplibre-container"
|
||||
data-controller="maps--maplibre area-drawer maps--maplibre-realtime"
|
||||
data-maps--maplibre-api-key-value="<%= current_user.api_key %>"
|
||||
data-maps--maplibre-start-date-value="<%= @start_at.to_s %>"
|
||||
data-maps--maplibre-end-date-value="<%= @end_at.to_s %>"
|
||||
data-maps--maplibre-start-date-value="<%= @start_at.iso8601 %>"
|
||||
data-maps--maplibre-end-date-value="<%= @end_at.iso8601 %>"
|
||||
data-maps--maplibre-timezone-value="<%= current_user.timezone %>"
|
||||
data-maps--maplibre-realtime-enabled-value="true"
|
||||
style="width: 100%; height: 100%; position: relative;">
|
||||
|
||||
|
||||
@@ -131,30 +131,44 @@
|
||||
</div>
|
||||
<% end %>
|
||||
|
||||
<div class="dropdown dropdown-end dropdown-bottom dropdown-hover"
|
||||
data-controller="notifications"
|
||||
data-notifications-user-id-value="<%= current_user.id %>">
|
||||
<div tabindex="0" role="button" class='btn btn-sm btn-ghost hover:btn-ghost p-2'>
|
||||
<%= icon 'bell' %>
|
||||
<% if @unread_notifications.present? %>
|
||||
<span class="badge badge-xs badge-primary absolute top-0 right-0" data-notifications-target="badge">
|
||||
<%= @unread_notifications.size %>
|
||||
</span>
|
||||
<% end %>
|
||||
</div>
|
||||
<ul tabindex="0" class="dropdown-content z-[5000] menu p-2 shadow-lg bg-base-100 rounded-box min-w-52" data-notifications-target="list">
|
||||
<li><%= link_to 'See all', notifications_path %></li>
|
||||
<% @unread_notifications.first(10).each do |notification| %>
|
||||
<div class="divider p-0 m-0"></div>
|
||||
<li class='notification-item'>
|
||||
<%= link_to notification do %>
|
||||
<%= notification.title %>
|
||||
<div class="badge badge-xs justify-self-end badge-<%= notification.kind %>"></div>
|
||||
<% end %>
|
||||
</li>
|
||||
<% end %>
|
||||
</ul>
|
||||
</div>
|
||||
<li data-controller="notifications"
|
||||
data-notifications-user-id-value="<%= current_user.id %>">
|
||||
<details>
|
||||
<summary class="relative">
|
||||
<%= icon 'bell' %>
|
||||
<% if @unread_notifications.present? %>
|
||||
<span class="badge badge-xs badge-primary absolute top-0 right-0" data-notifications-target="badge">
|
||||
<%= @unread_notifications.size %>
|
||||
</span>
|
||||
<% end %>
|
||||
</summary>
|
||||
<ul class="p-2 bg-base-100 rounded-t-none z-10 min-w-52" data-notifications-target="list">
|
||||
<li><%= link_to 'See all', notifications_path %></li>
|
||||
<% @unread_notifications.first(10).each do |notification| %>
|
||||
<div class="divider p-0 m-0"></div>
|
||||
<li class='notification-item'>
|
||||
<%= link_to notification do %>
|
||||
<%= notification.title %>
|
||||
<div class="badge badge-xs justify-self-end badge-<%= notification.kind %>"></div>
|
||||
<% end %>
|
||||
</li>
|
||||
<% end %>
|
||||
</ul>
|
||||
</details>
|
||||
</li>
|
||||
<li>
|
||||
<details>
|
||||
<summary><%= icon 'message-circle-question-mark' %></summary>
|
||||
<ul class="p-2 bg-base-100 rounded-box shadow-md z-10 w-52">
|
||||
<li><p>Need help? Ping us! <%= icon 'arrow-big-down' %></p></li>
|
||||
<li><%= link_to 'X (Twitter)', 'https://x.com/freymakesstuff', target: '_blank', rel: 'noopener noreferrer' %></li>
|
||||
<li><%= link_to 'Mastodon', 'https://mastodon.social/@dawarich', target: '_blank', rel: 'noopener noreferrer' %></li>
|
||||
<li><%= link_to 'Email', 'mailto:hi@dawarich.app' %></li>
|
||||
<li><%= link_to 'Forum', 'https://discourse.dawarich.app', target: '_blank', rel: 'noopener noreferrer' %></li>
|
||||
<li><%= link_to 'Discord', 'https://discord.gg/pHsBjpt5J8', target: '_blank', rel: 'noopener noreferrer' %></li>
|
||||
</ul>
|
||||
</details>
|
||||
</li>
|
||||
<li>
|
||||
<details>
|
||||
<summary>
|
||||
|
||||
@@ -63,7 +63,8 @@
|
||||
data-public-stat-map-uuid-value="<%= @stat.sharing_uuid %>"
|
||||
data-public-stat-map-data-bounds-value="<%= @data_bounds.to_json if @data_bounds %>"
|
||||
data-public-stat-map-hexagons-available-value="<%= @hexagons_available.to_s %>"
|
||||
data-public-stat-map-self-hosted-value="<%= @self_hosted %>"></div>
|
||||
data-public-stat-map-self-hosted-value="<%= @self_hosted %>"
|
||||
data-public-stat-map-timezone-value="<%= @user.timezone %>"></div>
|
||||
|
||||
<!-- Loading overlay -->
|
||||
<div id="map-loading" class="absolute inset-0 bg-base-200 bg-opacity-80 flex items-center justify-center z-50">
|
||||
|
||||
@@ -22,7 +22,7 @@
|
||||
data-path="<%= trip.path.to_json %>"
|
||||
data-started_at="<%= trip.started_at %>"
|
||||
data-ended_at="<%= trip.ended_at %>"
|
||||
data-timezone="<%= Rails.configuration.time_zone %>">
|
||||
data-timezone="<%= current_user.timezone %>">
|
||||
</div>
|
||||
</div>
|
||||
|
||||
|
||||
@@ -9,7 +9,7 @@
|
||||
data-path="<%= trip.path.coordinates.to_json %>"
|
||||
data-started_at="<%= trip.started_at %>"
|
||||
data-ended_at="<%= trip.ended_at %>"
|
||||
data-timezone="<%= Rails.configuration.time_zone %>">
|
||||
data-timezone="<%= trip.user.timezone %>">
|
||||
<div data-trips-target="container" class="h-[25rem] w-full min-h-screen md:h-64">
|
||||
</div>
|
||||
</div>
|
||||
|
||||
@@ -16,7 +16,7 @@
|
||||
data-trip-map-path-value="<%= trip.path.coordinates.to_json %>"
|
||||
data-trip-map-api-key-value="<%= current_user.api_key %>"
|
||||
data-trip-map-user-settings-value="<%= current_user.safe_settings.settings.to_json %>"
|
||||
data-trip-map-timezone-value="<%= Rails.configuration.time_zone %>">
|
||||
data-trip-map-timezone-value="<%= trip.user.timezone %>">
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
@@ -37,6 +37,6 @@ module Dawarich
|
||||
|
||||
config.active_job.queue_adapter = :sidekiq
|
||||
|
||||
config.action_mailer.preview_paths << "#{Rails.root}/spec/mailers/previews"
|
||||
config.action_mailer.preview_paths << "#{Rails.root.join('spec/mailers/previews')}"
|
||||
end
|
||||
end
|
||||
|
||||
@@ -103,7 +103,7 @@ Rails.application.configure do
|
||||
# /.*\.example\.com/ # Allow requests from subdomains like `www.example.com`
|
||||
# ]
|
||||
# Skip DNS rebinding protection for the health check endpoint.
|
||||
config.host_authorization = { exclude: ->(request) { request.path == "/api/v1/health" } }
|
||||
config.host_authorization = { exclude: ->(request) { request.path == '/api/v1/health' } }
|
||||
hosts = ENV.fetch('APPLICATION_HOSTS', 'localhost').split(',').map(&:strip)
|
||||
|
||||
config.action_mailer.default_url_options = { host: ENV['DOMAIN'] }
|
||||
|
||||
@@ -62,3 +62,6 @@ OIDC_AUTO_REGISTER = ENV.fetch('OIDC_AUTO_REGISTER', 'true') == 'true'
|
||||
|
||||
# Email/password registration setting (default: false for self-hosted, true for cloud)
|
||||
ALLOW_EMAIL_PASSWORD_REGISTRATION = ENV.fetch('ALLOW_EMAIL_PASSWORD_REGISTRATION', 'false') == 'true'
|
||||
|
||||
# Raw data archival setting
|
||||
ARCHIVE_RAW_DATA = ENV.fetch('ARCHIVE_RAW_DATA', 'false') == 'true'
|
||||
|
||||
@@ -49,5 +49,9 @@ class DawarichSettings
|
||||
family: family_feature_enabled?
|
||||
}
|
||||
end
|
||||
|
||||
def archive_raw_data_enabled?
|
||||
@archive_raw_data_enabled ||= ARCHIVE_RAW_DATA
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
@@ -2,14 +2,17 @@
|
||||
|
||||
require 'aws-sdk-core'
|
||||
|
||||
# Support both AWS_ENDPOINT and AWS_ENDPOINT_URL for backwards compatibility
|
||||
endpoint_url = ENV['AWS_ENDPOINT_URL'] || ENV['AWS_ENDPOINT']
|
||||
|
||||
if ENV['AWS_ACCESS_KEY_ID'] &&
|
||||
ENV['AWS_SECRET_ACCESS_KEY'] &&
|
||||
ENV['AWS_REGION'] &&
|
||||
ENV['AWS_ENDPOINT']
|
||||
endpoint_url
|
||||
Aws.config.update(
|
||||
{
|
||||
region: ENV['AWS_REGION'],
|
||||
endpoint: ENV['AWS_ENDPOINT'],
|
||||
endpoint: endpoint_url,
|
||||
credentials: Aws::Credentials.new(ENV['AWS_ACCESS_KEY_ID'], ENV['AWS_SECRET_ACCESS_KEY'])
|
||||
}
|
||||
)
|
||||
|
||||
@@ -24,7 +24,7 @@ Sidekiq.configure_server do |config|
|
||||
end
|
||||
|
||||
Sidekiq.configure_client do |config|
|
||||
config.redis = { url: "#{ENV['REDIS_URL']}/#{ENV.fetch('RAILS_JOB_QUEUE_DB', 1)}" }
|
||||
config.redis = { url: ENV['REDIS_URL'], db: ENV.fetch('RAILS_JOB_QUEUE_DB', 1) }
|
||||
end
|
||||
|
||||
Sidekiq::Queue['reverse_geocoding'].limit = 1 if Sidekiq.server? && DawarichSettings.photon_uses_komoot_io?
|
||||
|
||||
@@ -16,3 +16,4 @@
|
||||
- places
|
||||
- app_version_checking
|
||||
- cache
|
||||
- archival
|
||||
|
||||
@@ -7,14 +7,16 @@ local:
|
||||
root: <%= Rails.root.join("storage") %>
|
||||
|
||||
# Only load S3 config if not in test environment
|
||||
<% if !Rails.env.test? && ENV['AWS_ACCESS_KEY_ID'] && ENV['AWS_SECRET_ACCESS_KEY'] && ENV['AWS_REGION'] && ENV['AWS_BUCKET'] && ENV['AWS_ENDPOINT_URL'] %>
|
||||
# Support both AWS_ENDPOINT and AWS_ENDPOINT_URL for backwards compatibility
|
||||
<% endpoint_url = ENV['AWS_ENDPOINT_URL'] || ENV['AWS_ENDPOINT'] %>
|
||||
<% if !Rails.env.test? && ENV['AWS_ACCESS_KEY_ID'] && ENV['AWS_SECRET_ACCESS_KEY'] && ENV['AWS_REGION'] && ENV['AWS_BUCKET'] && endpoint_url %>
|
||||
s3:
|
||||
service: S3
|
||||
access_key_id: <%= ENV.fetch("AWS_ACCESS_KEY_ID") %>
|
||||
secret_access_key: <%= ENV.fetch("AWS_SECRET_ACCESS_KEY") %>
|
||||
region: <%= ENV.fetch("AWS_REGION") %>
|
||||
bucket: <%= ENV.fetch("AWS_BUCKET") %>
|
||||
endpoint: <%= ENV.fetch("AWS_ENDPOINT_URL") %>
|
||||
endpoint: <%= endpoint_url %>
|
||||
<% end %>
|
||||
|
||||
# Remember not to checkin your GCS keyfile to a repository
|
||||
|
||||
db/migrate/20251206000001_create_points_raw_data_archives.rb (new file, 23 lines)
@@ -0,0 +1,23 @@
# frozen_string_literal: true

class CreatePointsRawDataArchives < ActiveRecord::Migration[8.0]
  def change
    create_table :points_raw_data_archives do |t|
      t.bigint :user_id, null: false
      t.integer :year, null: false
      t.integer :month, null: false
      t.integer :chunk_number, null: false, default: 1
      t.integer :point_count, null: false
      t.string :point_ids_checksum, null: false
      t.jsonb :metadata, default: {}, null: false
      t.datetime :archived_at, null: false

      t.timestamps
    end

    add_index :points_raw_data_archives, :user_id
    add_index :points_raw_data_archives, [:user_id, :year, :month]
    add_index :points_raw_data_archives, :archived_at
    add_foreign_key :points_raw_data_archives, :users, validate: false
  end
end
db/migrate/20251206000002_add_archival_columns_to_points.rb (new file, 22 lines)
@@ -0,0 +1,22 @@
# frozen_string_literal: true

class AddArchivalColumnsToPoints < ActiveRecord::Migration[8.0]
  disable_ddl_transaction!

  def change
    add_column :points, :raw_data_archived, :boolean, default: false, null: false
    add_column :points, :raw_data_archive_id, :bigint, null: true

    add_index :points, :raw_data_archived,
              where: 'raw_data_archived = true',
              name: 'index_points_on_archived_true',
              algorithm: :concurrently
    add_index :points, :raw_data_archive_id,
              algorithm: :concurrently

    add_foreign_key :points, :points_raw_data_archives,
                    column: :raw_data_archive_id,
                    on_delete: :nullify, # Don't delete points if archive deleted
                    validate: false
  end
end
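The partial index above only covers rows with raw_data_archived = true, which is the shape of the lookup the Clearer runs when it hunts for archived points whose raw_data has not yet been emptied. A small sketch of a query that can use it (same scope as Clearer#points_needing_clearing):

```ruby
# Served by the partial index index_points_on_archived_true
Point.where(raw_data_archived: true).where.not(raw_data: {}).count
```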
@@ -0,0 +1,8 @@
# frozen_string_literal: true

class ValidateArchivalForeignKeys < ActiveRecord::Migration[8.0]
  def change
    validate_foreign_key :points_raw_data_archives, :users
    validate_foreign_key :points, :points_raw_data_archives
  end
end
db/migrate/20251208210410_add_composite_index_to_stats.rb (new file, 18 lines)
@@ -0,0 +1,18 @@
# frozen_string_literal: true

class AddCompositeIndexToStats < ActiveRecord::Migration[8.0]
  disable_ddl_transaction!

  def change
    # Add composite index for the most common stats lookup pattern:
    # Stat.find_or_initialize_by(year:, month:, user:)
    # This query is called on EVERY stats calculation
    #
    # Using algorithm: :concurrently to avoid locking the table during index creation
    # This is crucial for production deployments with existing data
    add_index :stats, %i[user_id year month],
              name: 'index_stats_on_user_id_year_month',
              unique: true,
              algorithm: :concurrently
  end
end
@@ -0,0 +1,5 @@
class AddVerifiedAtToPointsRawDataArchives < ActiveRecord::Migration[8.0]
  def change
    add_column :points_raw_data_archives, :verified_at, :datetime
  end
end
db/schema.rb (generated, 27 lines changed)
@@ -10,7 +10,7 @@
|
||||
#
|
||||
# It's strongly recommended that you check this file into your version control system.
|
||||
|
||||
ActiveRecord::Schema[8.0].define(version: 2025_12_01_192510) do
|
||||
ActiveRecord::Schema[8.0].define(version: 2025_12_10_193532) do
|
||||
# These are extensions that must be enabled in order to support this database
|
||||
enable_extension "pg_catalog.plpgsql"
|
||||
enable_extension "postgis"
|
||||
@@ -224,6 +224,8 @@ ActiveRecord::Schema[8.0].define(version: 2025_12_01_192510) do
|
||||
t.bigint "country_id"
|
||||
t.bigint "track_id"
|
||||
t.string "country_name"
|
||||
t.boolean "raw_data_archived", default: false, null: false
|
||||
t.bigint "raw_data_archive_id"
|
||||
t.index ["altitude"], name: "index_points_on_altitude"
|
||||
t.index ["battery"], name: "index_points_on_battery"
|
||||
t.index ["battery_status"], name: "index_points_on_battery_status"
|
||||
@@ -238,6 +240,8 @@ ActiveRecord::Schema[8.0].define(version: 2025_12_01_192510) do
|
||||
t.index ["latitude", "longitude"], name: "index_points_on_latitude_and_longitude"
|
||||
t.index ["lonlat", "timestamp", "user_id"], name: "index_points_on_lonlat_timestamp_user_id", unique: true
|
||||
t.index ["lonlat"], name: "index_points_on_lonlat", using: :gist
|
||||
t.index ["raw_data_archive_id"], name: "index_points_on_raw_data_archive_id"
|
||||
t.index ["raw_data_archived"], name: "index_points_on_archived_true", where: "(raw_data_archived = true)"
|
||||
t.index ["reverse_geocoded_at"], name: "index_points_on_reverse_geocoded_at"
|
||||
t.index ["timestamp"], name: "index_points_on_timestamp"
|
||||
t.index ["track_id"], name: "index_points_on_track_id"
|
||||
@@ -249,6 +253,23 @@ ActiveRecord::Schema[8.0].define(version: 2025_12_01_192510) do
|
||||
t.index ["visit_id"], name: "index_points_on_visit_id"
|
||||
end
|
||||
|
||||
create_table "points_raw_data_archives", force: :cascade do |t|
|
||||
t.bigint "user_id", null: false
|
||||
t.integer "year", null: false
|
||||
t.integer "month", null: false
|
||||
t.integer "chunk_number", default: 1, null: false
|
||||
t.integer "point_count", null: false
|
||||
t.string "point_ids_checksum", null: false
|
||||
t.jsonb "metadata", default: {}, null: false
|
||||
t.datetime "archived_at", null: false
|
||||
t.datetime "created_at", null: false
|
||||
t.datetime "updated_at", null: false
|
||||
t.datetime "verified_at"
|
||||
t.index ["archived_at"], name: "index_points_raw_data_archives_on_archived_at"
|
||||
t.index ["user_id", "year", "month"], name: "index_points_raw_data_archives_on_user_id_and_year_and_month"
|
||||
t.index ["user_id"], name: "index_points_raw_data_archives_on_user_id"
|
||||
end
|
||||
|
||||
create_table "stats", force: :cascade do |t|
|
||||
t.integer "year", null: false
|
||||
t.integer "month", null: false
|
||||
@@ -265,6 +286,7 @@ ActiveRecord::Schema[8.0].define(version: 2025_12_01_192510) do
|
||||
t.index ["h3_hex_ids"], name: "index_stats_on_h3_hex_ids", where: "((h3_hex_ids IS NOT NULL) AND (h3_hex_ids <> '{}'::jsonb))", using: :gin
|
||||
t.index ["month"], name: "index_stats_on_month"
|
||||
t.index ["sharing_uuid"], name: "index_stats_on_sharing_uuid", unique: true
|
||||
t.index ["user_id", "year", "month"], name: "index_stats_on_user_id_year_month", unique: true
|
||||
t.index ["user_id"], name: "index_stats_on_user_id"
|
||||
t.index ["year"], name: "index_stats_on_year"
|
||||
end
|
||||
@@ -351,6 +373,7 @@ ActiveRecord::Schema[8.0].define(version: 2025_12_01_192510) do
|
||||
t.string "utm_term"
|
||||
t.string "utm_content"
|
||||
t.index ["email"], name: "index_users_on_email", unique: true
|
||||
t.index ["provider", "uid"], name: "index_users_on_provider_and_uid", unique: true
|
||||
t.index ["reset_password_token"], name: "index_users_on_reset_password_token", unique: true
|
||||
end
|
||||
|
||||
@@ -384,8 +407,10 @@ ActiveRecord::Schema[8.0].define(version: 2025_12_01_192510) do
|
||||
add_foreign_key "notifications", "users"
|
||||
add_foreign_key "place_visits", "places"
|
||||
add_foreign_key "place_visits", "visits"
|
||||
add_foreign_key "points", "points_raw_data_archives", column: "raw_data_archive_id", on_delete: :nullify
|
||||
add_foreign_key "points", "users"
|
||||
add_foreign_key "points", "visits"
|
||||
add_foreign_key "points_raw_data_archives", "users"
|
||||
add_foreign_key "stats", "users"
|
||||
add_foreign_key "taggings", "tags"
|
||||
add_foreign_key "tags", "users"
|
||||
|
||||
@@ -4,7 +4,8 @@ import {
|
||||
navigateToMapsV2WithDate,
|
||||
waitForLoadingComplete,
|
||||
hasLayer,
|
||||
getPointsSourceData
|
||||
getPointsSourceData,
|
||||
getRoutesSourceData
|
||||
} from '../../helpers/setup.js'
|
||||
|
||||
test.describe('Points Layer', () => {
|
||||
@@ -68,4 +69,424 @@ test.describe('Points Layer', () => {
|
||||
}
|
||||
})
|
||||
})
|
||||
|
||||
test.describe('Dragging', () => {
|
||||
test('allows dragging points to new positions', async ({ page }) => {
|
||||
// Wait for points to load
|
||||
await page.waitForFunction(() => {
|
||||
const element = document.querySelector('[data-controller*="maps--maplibre"]')
|
||||
const app = window.Stimulus || window.Application
|
||||
const controller = app?.getControllerForElementAndIdentifier(element, 'maps--maplibre')
|
||||
const source = controller?.map?.getSource('points-source')
|
||||
return source?._data?.features?.length > 0
|
||||
}, { timeout: 15000 })
|
||||
|
||||
// Get initial point data
|
||||
const initialData = await getPointsSourceData(page)
|
||||
expect(initialData.features.length).toBeGreaterThan(0)
|
||||
|
||||
|
||||
// Get the map canvas bounds
|
||||
const canvas = page.locator('.maplibregl-canvas')
|
||||
const canvasBounds = await canvas.boundingBox()
|
||||
expect(canvasBounds).not.toBeNull()
|
||||
|
||||
// Ensure points layer is visible before testing dragging
|
||||
const layerState = await page.evaluate(() => {
|
||||
const element = document.querySelector('[data-controller*="maps--maplibre"]')
|
||||
const app = window.Stimulus || window.Application
|
||||
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
|
||||
const pointsLayer = controller?.layerManager?.layers?.pointsLayer
|
||||
|
||||
if (!pointsLayer) {
|
||||
return { exists: false, visibleBefore: false, visibleAfter: false, draggingEnabled: false }
|
||||
}
|
||||
|
||||
const visibilityBefore = controller.map.getLayoutProperty('points', 'visibility')
|
||||
const isVisibleBefore = visibilityBefore === 'visible' || visibilityBefore === undefined
|
||||
|
||||
// If not visible, make it visible
|
||||
if (!isVisibleBefore) {
|
||||
pointsLayer.show()
|
||||
}
|
||||
|
||||
// Check again after calling show
|
||||
const visibilityAfter = controller.map.getLayoutProperty('points', 'visibility')
|
||||
const isVisibleAfter = visibilityAfter === 'visible' || visibilityAfter === undefined
|
||||
|
||||
return {
|
||||
exists: true,
|
||||
visibleBefore: isVisibleBefore,
|
||||
visibleAfter: isVisibleAfter,
|
||||
draggingEnabled: pointsLayer.draggingEnabled || false
|
||||
}
|
||||
})
|
||||
|
||||
|
||||
// Wait longer for layer to render after visibility change
|
||||
await page.waitForTimeout(2000)
|
||||
|
||||
// Find a rendered point feature on the map and get its pixel coordinates
|
||||
const renderedPoint = await page.evaluate(() => {
|
||||
const element = document.querySelector('[data-controller*="maps--maplibre"]')
|
||||
const app = window.Stimulus || window.Application
|
||||
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
|
||||
|
||||
// Get all rendered point features
|
||||
const features = controller.map.queryRenderedFeatures(undefined, { layers: ['points'] })
|
||||
|
||||
if (features.length === 0) {
|
||||
return { found: false, totalFeatures: 0 }
|
||||
}
|
||||
|
||||
// Pick the first rendered point
|
||||
const feature = features[0]
|
||||
const coords = feature.geometry.coordinates
|
||||
const point = controller.map.project(coords)
|
||||
|
||||
// Get the canvas position on the page
|
||||
const canvas = controller.map.getCanvas()
|
||||
const rect = canvas.getBoundingClientRect()
|
||||
|
||||
return {
|
||||
found: true,
|
||||
totalFeatures: features.length,
|
||||
pointId: feature.properties.id,
|
||||
coords: coords,
|
||||
x: point.x,
|
||||
y: point.y,
|
||||
pageX: rect.left + point.x,
|
||||
pageY: rect.top + point.y
|
||||
}
|
||||
})
|
||||
|
||||
|
||||
expect(renderedPoint.found).toBe(true)
|
||||
expect(renderedPoint.totalFeatures).toBeGreaterThan(0)
|
||||
|
||||
const pointId = renderedPoint.pointId
|
||||
const initialCoords = renderedPoint.coords
|
||||
const pointPixel = {
|
||||
x: renderedPoint.x,
|
||||
y: renderedPoint.y,
|
||||
pageX: renderedPoint.pageX,
|
||||
pageY: renderedPoint.pageY
|
||||
}
|
||||
|
||||
|
||||
// Drag the point by 100 pixels to the right and 100 down (larger movement for visibility)
|
||||
const dragOffset = { x: 100, y: 100 }
|
||||
const startX = pointPixel.pageX
|
||||
const startY = pointPixel.pageY
|
||||
const endX = startX + dragOffset.x
|
||||
const endY = startY + dragOffset.y
|
||||
|
||||
|
||||
// Check cursor style on hover
|
||||
await page.mouse.move(startX, startY)
|
||||
await page.waitForTimeout(200)
|
||||
|
||||
const cursorStyle = await page.evaluate(() => {
|
||||
const canvas = document.querySelector('.maplibregl-canvas-container')
|
||||
return window.getComputedStyle(canvas).cursor
|
||||
})
|
||||
|
||||
// Perform the drag operation with slower movement
|
||||
await page.mouse.down()
|
||||
await page.waitForTimeout(100)
|
||||
await page.mouse.move(endX, endY, { steps: 20 })
|
||||
await page.waitForTimeout(100)
|
||||
await page.mouse.up()
|
||||
|
||||
// Wait for API call to complete
|
||||
await page.waitForTimeout(3000)
|
||||
|
||||
// Get updated point data
|
||||
const updatedData = await getPointsSourceData(page)
|
||||
const updatedPoint = updatedData.features.find(f => f.properties.id === pointId)
|
||||
|
||||
expect(updatedPoint).toBeDefined()
|
||||
const updatedCoords = updatedPoint.geometry.coordinates
|
||||
|
||||
|
||||
// Verify the point has moved (parse coordinates as numbers)
|
||||
const updatedLng = parseFloat(updatedCoords[0])
|
||||
const updatedLat = parseFloat(updatedCoords[1])
|
||||
const initialLng = parseFloat(initialCoords[0])
|
||||
const initialLat = parseFloat(initialCoords[1])
|
||||
|
||||
expect(updatedLng).not.toBeCloseTo(initialLng, 5)
|
||||
expect(updatedLat).not.toBeCloseTo(initialLat, 5)
|
||||
})
|
||||
|
||||
test('updates connected route segments when point is dragged', async ({ page }) => {
|
||||
// Wait for both points and routes to load
|
||||
await page.waitForFunction(() => {
|
||||
const element = document.querySelector('[data-controller*="maps--maplibre"]')
|
||||
const app = window.Stimulus || window.Application
|
||||
const controller = app?.getControllerForElementAndIdentifier(element, 'maps--maplibre')
|
||||
const pointsSource = controller?.map?.getSource('points-source')
|
||||
const routesSource = controller?.map?.getSource('routes-source')
|
||||
return pointsSource?._data?.features?.length > 0 &&
|
||||
routesSource?._data?.features?.length > 0
|
||||
}, { timeout: 15000 })
|
||||
|
||||
// Ensure points layer is visible
|
||||
await page.evaluate(() => {
|
||||
const element = document.querySelector('[data-controller*="maps--maplibre"]')
|
||||
const app = window.Stimulus || window.Application
|
||||
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
|
||||
const pointsLayer = controller?.layerManager?.layers?.pointsLayer
|
||||
if (pointsLayer) {
|
||||
const visibility = controller.map.getLayoutProperty('points', 'visibility')
|
||||
if (visibility === 'none') {
|
||||
pointsLayer.show()
|
||||
}
|
||||
}
|
||||
})
|
||||
|
||||
await page.waitForTimeout(2000)
|
||||
|
||||
// Get initial data
|
||||
const initialRoutesData = await getRoutesSourceData(page)
|
||||
expect(initialRoutesData.features.length).toBeGreaterThan(0)
|
||||
|
||||
// Find a rendered point feature on the map
|
||||
const renderedPoint = await page.evaluate(() => {
|
||||
const element = document.querySelector('[data-controller*="maps--maplibre"]')
|
||||
const app = window.Stimulus || window.Application
|
||||
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
|
||||
|
||||
// Get all rendered point features
|
||||
const features = controller.map.queryRenderedFeatures(undefined, { layers: ['points'] })
|
||||
|
||||
if (features.length === 0) {
|
||||
return { found: false }
|
||||
}
|
||||
|
||||
// Pick the first rendered point
|
||||
const feature = features[0]
|
||||
const coords = feature.geometry.coordinates
|
||||
const point = controller.map.project(coords)
|
||||
|
||||
// Get the canvas position on the page
|
||||
const canvas = controller.map.getCanvas()
|
||||
const rect = canvas.getBoundingClientRect()
|
||||
|
||||
return {
|
||||
found: true,
|
||||
pointId: feature.properties.id,
|
||||
coords: coords,
|
||||
x: point.x,
|
||||
y: point.y,
|
||||
pageX: rect.left + point.x,
|
||||
pageY: rect.top + point.y
|
||||
}
|
||||
})
|
||||
|
||||
expect(renderedPoint.found).toBe(true)
|
||||
|
||||
const pointId = renderedPoint.pointId
|
||||
const initialCoords = renderedPoint.coords
|
||||
const pointPixel = {
|
||||
x: renderedPoint.x,
|
||||
y: renderedPoint.y,
|
||||
pageX: renderedPoint.pageX,
|
||||
pageY: renderedPoint.pageY
|
||||
}
|
||||
|
||||
// Find routes that contain this point
|
||||
const connectedRoutes = initialRoutesData.features.filter(route => {
|
||||
return route.geometry.coordinates.some(coord =>
|
||||
Math.abs(coord[0] - initialCoords[0]) < 0.0001 &&
|
||||
Math.abs(coord[1] - initialCoords[1]) < 0.0001
|
||||
)
|
||||
})
|
||||
|
||||
|
||||
const dragOffset = { x: 100, y: 100 }
|
||||
const startX = pointPixel.pageX
|
||||
const startY = pointPixel.pageY
|
||||
const endX = startX + dragOffset.x
|
||||
const endY = startY + dragOffset.y
|
||||
|
||||
// Perform drag with slower movement
|
||||
await page.mouse.move(startX, startY)
|
||||
await page.waitForTimeout(100)
|
||||
await page.mouse.down()
|
||||
await page.waitForTimeout(100)
|
||||
await page.mouse.move(endX, endY, { steps: 20 })
|
||||
await page.waitForTimeout(100)
|
||||
await page.mouse.up()
|
||||
|
||||
// Wait for updates
|
||||
await page.waitForTimeout(3000)
|
||||
|
||||
// Get updated data
|
||||
const updatedPointsData = await getPointsSourceData(page)
|
||||
const updatedRoutesData = await getRoutesSourceData(page)
|
||||
|
||||
const updatedPoint = updatedPointsData.features.find(f => f.properties.id === pointId)
|
||||
const updatedCoords = updatedPoint.geometry.coordinates
|
||||
|
||||
// Verify routes have been updated
|
||||
const updatedConnectedRoutes = updatedRoutesData.features.filter(route => {
|
||||
return route.geometry.coordinates.some(coord =>
|
||||
Math.abs(coord[0] - updatedCoords[0]) < 0.0001 &&
|
||||
Math.abs(coord[1] - updatedCoords[1]) < 0.0001
|
||||
)
|
||||
})
|
||||
|
||||
|
||||
// Routes that were originally connected should now be at the new position
|
||||
if (connectedRoutes.length > 0) {
|
||||
expect(updatedConnectedRoutes.length).toBeGreaterThan(0)
|
||||
}
|
||||
|
||||
// The point moved, so verify the coordinates actually changed
|
||||
const lngChanged = Math.abs(parseFloat(updatedCoords[0]) - initialCoords[0]) > 0.0001
|
||||
const latChanged = Math.abs(parseFloat(updatedCoords[1]) - initialCoords[1]) > 0.0001
|
||||
|
||||
expect(lngChanged || latChanged).toBe(true)
|
||||
|
||||
// Since the route segments update is best-effort (depends on coordinate matching),
|
||||
// we'll just verify that routes exist and the point moved
|
||||
})
|
||||
|
||||
test('persists point position after page reload', async ({ page }) => {
|
||||
// Wait for points to load
|
||||
await page.waitForFunction(() => {
|
||||
const element = document.querySelector('[data-controller*="maps--maplibre"]')
|
||||
const app = window.Stimulus || window.Application
|
||||
const controller = app?.getControllerForElementAndIdentifier(element, 'maps--maplibre')
|
||||
const source = controller?.map?.getSource('points-source')
|
||||
return source?._data?.features?.length > 0
|
||||
}, { timeout: 15000 })
|
||||
|
||||
// Ensure points layer is visible
|
||||
await page.evaluate(() => {
|
||||
const element = document.querySelector('[data-controller*="maps--maplibre"]')
|
||||
const app = window.Stimulus || window.Application
|
||||
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
|
||||
const pointsLayer = controller?.layerManager?.layers?.pointsLayer
|
||||
if (pointsLayer) {
|
||||
const visibility = controller.map.getLayoutProperty('points', 'visibility')
|
||||
if (visibility === 'none') {
|
||||
pointsLayer.show()
|
||||
}
|
||||
}
|
||||
})
|
||||
|
||||
await page.waitForTimeout(2000)
|
||||
|
||||
// Find a rendered point feature on the map
|
||||
const renderedPoint = await page.evaluate(() => {
|
||||
const element = document.querySelector('[data-controller*="maps--maplibre"]')
|
||||
const app = window.Stimulus || window.Application
|
||||
const controller = app.getControllerForElementAndIdentifier(element, 'maps--maplibre')
|
||||
|
||||
// Get all rendered point features
|
||||
const features = controller.map.queryRenderedFeatures(undefined, { layers: ['points'] })
|
||||
|
||||
if (features.length === 0) {
|
||||
return { found: false }
|
||||
}
|
||||
|
||||
// Pick the first rendered point
|
||||
const feature = features[0]
|
||||
const coords = feature.geometry.coordinates
|
||||
const point = controller.map.project(coords)
|
||||
|
||||
// Get the canvas position on the page
|
||||
const canvas = controller.map.getCanvas()
|
||||
const rect = canvas.getBoundingClientRect()
|
||||
|
||||
return {
|
||||
found: true,
|
||||
pointId: feature.properties.id,
|
||||
coords: coords,
|
||||
x: point.x,
|
||||
y: point.y,
|
||||
pageX: rect.left + point.x,
|
||||
pageY: rect.top + point.y
|
||||
}
|
||||
})
|
||||
|
||||
expect(renderedPoint.found).toBe(true)
|
||||
|
||||
const pointId = renderedPoint.pointId
|
||||
const initialCoords = renderedPoint.coords
|
||||
const pointPixel = {
|
||||
x: renderedPoint.x,
|
||||
y: renderedPoint.y,
|
||||
pageX: renderedPoint.pageX,
|
||||
pageY: renderedPoint.pageY
|
||||
}
|
||||
|
||||
|
||||
const dragOffset = { x: 100, y: 100 }
|
||||
const startX = pointPixel.pageX
|
||||
const startY = pointPixel.pageY
|
||||
const endX = startX + dragOffset.x
|
||||
const endY = startY + dragOffset.y
|
||||
|
||||
// Perform drag with slower movement
|
||||
await page.mouse.move(startX, startY)
|
||||
await page.waitForTimeout(100)
|
||||
await page.mouse.down()
|
||||
await page.waitForTimeout(100)
|
||||
await page.mouse.move(endX, endY, { steps: 20 })
|
||||
await page.waitForTimeout(100)
|
||||
await page.mouse.up()
|
||||
|
||||
// Wait for API call
|
||||
await page.waitForTimeout(3000)
|
||||
|
||||
// Get the new position
|
||||
const afterDragData = await getPointsSourceData(page)
|
||||
const afterDragPoint = afterDragData.features.find(f => f.properties.id === pointId)
|
||||
const afterDragCoords = afterDragPoint.geometry.coordinates
|
||||
|
||||
|
||||
// Reload the page
|
||||
await page.reload()
|
||||
await closeOnboardingModal(page)
|
||||
await waitForLoadingComplete(page)
|
||||
await page.waitForTimeout(1500)
|
||||
|
||||
// Wait for points to reload
|
||||
await page.waitForFunction(() => {
|
||||
const element = document.querySelector('[data-controller*="maps--maplibre"]')
|
||||
const app = window.Stimulus || window.Application
|
||||
const controller = app?.getControllerForElementAndIdentifier(element, 'maps--maplibre')
|
||||
const source = controller?.map?.getSource('points-source')
|
||||
return source?._data?.features?.length > 0
|
||||
}, { timeout: 15000 })
|
||||
|
||||
// Get point after reload
|
||||
const afterReloadData = await getPointsSourceData(page)
|
||||
const afterReloadPoint = afterReloadData.features.find(f => f.properties.id === pointId)
|
||||
const afterReloadCoords = afterReloadPoint.geometry.coordinates
|
||||
|
||||
|
||||
// Verify the position persisted (parse coordinates as numbers)
|
||||
const reloadLng = parseFloat(afterReloadCoords[0])
|
||||
const reloadLat = parseFloat(afterReloadCoords[1])
|
||||
const dragLng = parseFloat(afterDragCoords[0])
|
||||
const dragLat = parseFloat(afterDragCoords[1])
|
||||
const initialLng = parseFloat(initialCoords[0])
|
||||
const initialLat = parseFloat(initialCoords[1])
|
||||
|
||||
// Position after reload should match position after drag (high precision)
|
||||
expect(reloadLng).toBeCloseTo(dragLng, 5)
|
||||
expect(reloadLat).toBeCloseTo(dragLat, 5)
|
||||
|
||||
// And it should be different from the initial position (lower precision - just verify it moved)
|
||||
const lngDiff = Math.abs(reloadLng - initialLng)
|
||||
const latDiff = Math.abs(reloadLat - initialLat)
|
||||
const moved = lngDiff > 0.00001 || latDiff > 0.00001
|
||||
|
||||
expect(moved).toBe(true)
|
||||
})
|
||||
})
|
||||
})

@@ -124,8 +124,16 @@ test.describe('Routes Layer', () => {

    expect(routeLayerInfo).toBeTruthy()
    expect(routeLayerInfo.exists).toBe(true)
    expect(routeLayerInfo.isArray).toBe(false)
    expect(routeLayerInfo.value).toBe('#f97316')

    // Route color is now a MapLibre expression that supports dynamic colors
    // Format: ['case', ['has', 'color'], ['get', 'color'], '#0000ff']
    if (routeLayerInfo.isArray) {
      // It's a MapLibre expression, check the default color (last element)
      expect(routeLayerInfo.value[routeLayerInfo.value.length - 1]).toBe('#0000ff')
    } else {
      // Solid color (fallback)
      expect(routeLayerInfo.value).toBe('#0000ff')
    }
  })
})

lib/tasks/points_raw_data.rake  (new file, 295 lines)
@@ -0,0 +1,295 @@
# frozen_string_literal: true

namespace :points do
  namespace :raw_data do
    desc 'Restore raw_data from archive to database for a specific month'
    task :restore, [:user_id, :year, :month] => :environment do |_t, args|
      validate_args!(args)

      user_id = args[:user_id].to_i
      year = args[:year].to_i
      month = args[:month].to_i

      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ' Restoring raw_data to DATABASE'
      puts " User: #{user_id} | Month: #{year}-#{format('%02d', month)}"
      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ''

      restorer = Points::RawData::Restorer.new
      restorer.restore_to_database(user_id, year, month)

      puts ''
      puts '✓ Restoration complete!'
      puts ''
      puts "Points in #{year}-#{month} now have raw_data in database."
      puts 'Run VACUUM ANALYZE points; to update statistics.'
    end

    desc 'Restore raw_data to memory/cache temporarily (for data migrations)'
    task :restore_temporary, [:user_id, :year, :month] => :environment do |_t, args|
      validate_args!(args)

      user_id = args[:user_id].to_i
      year = args[:year].to_i
      month = args[:month].to_i

      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ' Loading raw_data into CACHE (temporary)'
      puts " User: #{user_id} | Month: #{year}-#{format('%02d', month)}"
      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ''
      puts 'Data will be available for 1 hour via Point.raw_data_with_archive accessor'
      puts ''

      restorer = Points::RawData::Restorer.new
      restorer.restore_to_memory(user_id, year, month)

      puts ''
      puts '✓ Cache loaded successfully!'
      puts ''
      puts 'You can now run your data migration.'
      puts 'Example:'
      puts "  rails runner \"Point.where(user_id: #{user_id}, timestamp_year: #{year}, timestamp_month: #{month}).find_each { |p| p.fix_coordinates_from_raw_data }\""
      puts ''
      puts 'Cache will expire in 1 hour automatically.'
    end

    desc 'Restore all archived raw_data for a user'
    task :restore_all, [:user_id] => :environment do |_t, args|
      raise 'Usage: rake points:raw_data:restore_all[user_id]' unless args[:user_id]

      user_id = args[:user_id].to_i
      user = User.find(user_id)

      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ' Restoring ALL archives for user'
      puts " #{user.email} (ID: #{user_id})"
      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ''

      archives = Points::RawDataArchive.where(user_id: user_id)
                                       .select(:year, :month)
                                       .distinct
                                       .order(:year, :month)

      puts "Found #{archives.count} months to restore"
      puts ''

      archives.each_with_index do |archive, idx|
        puts "[#{idx + 1}/#{archives.count}] Restoring #{archive.year}-#{format('%02d', archive.month)}..."

        restorer = Points::RawData::Restorer.new
        restorer.restore_to_database(user_id, archive.year, archive.month)
      end

      puts ''
      puts "✓ All archives restored for user #{user_id}!"
    end

    desc 'Show archive statistics'
    task status: :environment do
      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ' Points raw_data Archive Statistics'
      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ''

      total_archives = Points::RawDataArchive.count
      verified_archives = Points::RawDataArchive.where.not(verified_at: nil).count
      unverified_archives = total_archives - verified_archives

      total_points = Point.count
      archived_points = Point.where(raw_data_archived: true).count
      cleared_points = Point.where(raw_data_archived: true, raw_data: {}).count
      archived_not_cleared = archived_points - cleared_points

      percentage = total_points.positive? ? (archived_points.to_f / total_points * 100).round(2) : 0

      puts "Archives: #{total_archives} (#{verified_archives} verified, #{unverified_archives} unverified)"
      puts "Points archived: #{archived_points} / #{total_points} (#{percentage}%)"
      puts "Points cleared: #{cleared_points}"
      puts "Archived but not cleared: #{archived_not_cleared}"
      puts ''

      # Storage size via ActiveStorage
      total_blob_size = ActiveStorage::Blob
                        .joins('INNER JOIN active_storage_attachments ON active_storage_attachments.blob_id = active_storage_blobs.id')
                        .where("active_storage_attachments.record_type = 'Points::RawDataArchive'")
                        .sum(:byte_size)

      puts "Storage used: #{ActiveSupport::NumberHelper.number_to_human_size(total_blob_size)}"
      puts ''

      # Recent activity
      recent = Points::RawDataArchive.where('archived_at > ?', 7.days.ago).count
      puts "Archives created last 7 days: #{recent}"
      puts ''

      # Top users
      puts 'Top 10 users by archive count:'
      puts '─────────────────────────────────────────────────'

      Points::RawDataArchive.group(:user_id)
                            .select('user_id, COUNT(*) as archive_count, SUM(point_count) as total_points')
                            .order('archive_count DESC')
                            .limit(10)
                            .each_with_index do |stat, idx|
        user = User.find(stat.user_id)
        puts "#{idx + 1}. #{user.email.ljust(30)} #{stat.archive_count.to_s.rjust(3)} archives, #{stat.total_points.to_s.rjust(8)} points"
      end

      puts ''
    end

    desc 'Verify archive integrity (all unverified archives, or specific month with args)'
    task :verify, [:user_id, :year, :month] => :environment do |_t, args|
      verifier = Points::RawData::Verifier.new

      if args[:user_id] && args[:year] && args[:month]
        # Verify specific month
        user_id = args[:user_id].to_i
        year = args[:year].to_i
        month = args[:month].to_i

        puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
        puts ' Verifying Archives'
        puts " User: #{user_id} | Month: #{year}-#{format('%02d', month)}"
        puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
        puts ''

        verifier.verify_month(user_id, year, month)
      else
        # Verify all unverified archives
        puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
        puts ' Verifying All Unverified Archives'
        puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
        puts ''

        stats = verifier.call

        puts ''
        puts "Verified: #{stats[:verified]}"
        puts "Failed: #{stats[:failed]}"
      end

      puts ''
      puts '✓ Verification complete!'
    end

    desc 'Clear raw_data for verified archives (all verified, or specific month with args)'
    task :clear_verified, [:user_id, :year, :month] => :environment do |_t, args|
      clearer = Points::RawData::Clearer.new

      if args[:user_id] && args[:year] && args[:month]
        # Clear specific month
        user_id = args[:user_id].to_i
        year = args[:year].to_i
        month = args[:month].to_i

        puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
        puts ' Clearing Verified Archives'
        puts " User: #{user_id} | Month: #{year}-#{format('%02d', month)}"
        puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
        puts ''

        clearer.clear_month(user_id, year, month)
      else
        # Clear all verified archives
        puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
        puts ' Clearing All Verified Archives'
        puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
        puts ''

        stats = clearer.call

        puts ''
        puts "Points cleared: #{stats[:cleared]}"
      end

      puts ''
      puts '✓ Clearing complete!'
      puts ''
      puts 'Run VACUUM ANALYZE points; to reclaim space and update statistics.'
    end

    desc 'Archive raw_data for old data (2+ months old, does NOT clear yet)'
    task archive: :environment do
      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ' Archiving Raw Data (2+ months old data)'
      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ''
      puts 'This will archive points.raw_data for months 2+ months old.'
      puts 'Raw data will NOT be cleared yet - use verify and clear_verified tasks.'
      puts 'This is safe to run multiple times (idempotent).'
      puts ''

      stats = Points::RawData::Archiver.new.call

      puts ''
      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ' Archival Complete'
      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ''
      puts "Months processed: #{stats[:processed]}"
      puts "Points archived: #{stats[:archived]}"
      puts "Failures: #{stats[:failed]}"
      puts ''

      # `next`, not `return`: returning from inside a rake task block raises LocalJumpError at runtime.
      next unless stats[:archived].positive?

      puts 'Next steps:'
      puts '1. Verify archives: rake points:raw_data:verify'
      puts '2. Clear verified data: rake points:raw_data:clear_verified'
      puts '3. Check stats: rake points:raw_data:status'
    end

    desc 'Full workflow: archive + verify + clear (for automated use)'
    task archive_full: :environment do
      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ' Full Archive Workflow'
      puts ' (Archive → Verify → Clear)'
      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ''

      # Step 1: Archive
      puts '▸ Step 1/3: Archiving...'
      archiver_stats = Points::RawData::Archiver.new.call
      puts "  ✓ Archived #{archiver_stats[:archived]} points"
      puts ''

      # Step 2: Verify
      puts '▸ Step 2/3: Verifying...'
      verifier_stats = Points::RawData::Verifier.new.call
      puts "  ✓ Verified #{verifier_stats[:verified]} archives"
      if verifier_stats[:failed].positive?
        puts "  ✗ Failed to verify #{verifier_stats[:failed]} archives"
        puts ''
        puts '⚠ Some archives failed verification. Data NOT cleared for safety.'
        puts 'Please investigate failed archives before running clear_verified.'
        exit 1
      end
      puts ''

      # Step 3: Clear
      puts '▸ Step 3/3: Clearing verified data...'
      clearer_stats = Points::RawData::Clearer.new.call
      puts "  ✓ Cleared #{clearer_stats[:cleared]} points"
      puts ''

      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ' ✓ Full Archive Workflow Complete!'
      puts '━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━'
      puts ''
      puts 'Run VACUUM ANALYZE points; to reclaim space.'
    end

    # Alias for backward compatibility
    task initial_archive: :archive
  end
end

def validate_args!(args)
  return if args[:user_id] && args[:year] && args[:month]

  raise 'Usage: rake points:raw_data:TASK[user_id,year,month]'
end
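
The tasks above describe a three-step lifecycle: archive first, verify the written chunks, and only then clear raw_data from the database. A minimal sketch of driving the same sequence from a Rails console or a scheduled script, assuming only the service classes and stats hashes already referenced in this file (Points::RawData::Archiver, Verifier and Clearer), not an additional API:

    # Sketch only: mirrors the archive_full task above.
    archive_stats = Points::RawData::Archiver.new.call
    puts "Archived #{archive_stats[:archived]} points across #{archive_stats[:processed]} months"

    verify_stats = Points::RawData::Verifier.new.call
    raise "#{verify_stats[:failed]} archives failed verification" if verify_stats[:failed].positive?

    clear_stats = Points::RawData::Clearer.new.call
    puts "Cleared raw_data on #{clear_stats[:cleared]} points"
    # Finish with VACUUM ANALYZE points; to reclaim space, as the tasks suggest.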

@@ -2,17 +2,20 @@

module Timestamps
  def self.parse_timestamp(timestamp)
    begin
      # if the timestamp is in ISO 8601 format, try to parse it
      DateTime.parse(timestamp).to_time.to_i
    rescue
    min_timestamp = Time.zone.parse('1970-01-01').to_i
    max_timestamp = Time.zone.parse('2100-01-01').to_i

    parsed = DateTime.parse(timestamp).to_time.to_i

    parsed.clamp(min_timestamp, max_timestamp)
  rescue StandardError
    result =
      if timestamp.to_s.length > 10
        # If the timestamp is in milliseconds, convert to seconds
        timestamp.to_i / 1000
      else
        # If the timestamp is in seconds, return it without change
        timestamp.to_i
      end
    end

    result.clamp(min_timestamp, max_timestamp)
  end
end
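
Note that the removed and added lines of this hunk appear interleaved above because the +/- markers were lost in extraction; the net change is that parse_timestamp now clamps every result to the 1970-01-01..2100-01-01 range. A rough illustration of the new behaviour (exact integers depend on the configured Time.zone; the last example assumes DateTime.parse rejects the integer so the fallback branch runs):

    Timestamps.parse_timestamp('2024-01-15T12:00:00Z') # => 1705320000, parsed and already in range
    Timestamps.parse_timestamp('1950-06-01')           # => Time.zone.parse('1970-01-01').to_i (clamped up)
    Timestamps.parse_timestamp('2150-01-01')           # => Time.zone.parse('2100-01-01').to_i (clamped down)
    Timestamps.parse_timestamp(1_717_171_717_000)      # 13 digits, treated as milliseconds => 1717171717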

spec/controllers/concerns/safe_timestamp_parser_spec.rb  (new file, 101 lines)
@@ -0,0 +1,101 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe SafeTimestampParser, type: :controller do
  include ActiveSupport::Testing::TimeHelpers

  controller(ActionController::Base) do
    include SafeTimestampParser

    def index
      render plain: safe_timestamp(params[:date]).to_s
    end
  end

  before do
    routes.draw { get 'index' => 'anonymous#index' }
  end

  describe '#safe_timestamp' do
    context 'with valid dates within range' do
      it 'returns correct timestamp for 2020-01-01' do
        get :index, params: { date: '2020-01-01' }
        expected = Time.zone.parse('2020-01-01').to_i
        expect(response.body).to eq(expected.to_s)
      end

      it 'returns correct timestamp for 1980-06-15' do
        get :index, params: { date: '1980-06-15' }
        expected = Time.zone.parse('1980-06-15').to_i
        expect(response.body).to eq(expected.to_s)
      end
    end

    context 'with dates before valid range' do
      it 'clamps year 1000 to minimum timestamp (1970-01-01)' do
        get :index, params: { date: '1000-01-30' }
        min_timestamp = Time.zone.parse('1970-01-01').to_i
        expect(response.body).to eq(min_timestamp.to_s)
      end

      it 'clamps year 1900 to minimum timestamp (1970-01-01)' do
        get :index, params: { date: '1900-12-25' }
        min_timestamp = Time.zone.parse('1970-01-01').to_i
        expect(response.body).to eq(min_timestamp.to_s)
      end

      it 'clamps year 1969 to minimum timestamp (1970-01-01)' do
        get :index, params: { date: '1969-07-20' }
        min_timestamp = Time.zone.parse('1970-01-01').to_i
        expect(response.body).to eq(min_timestamp.to_s)
      end
    end

    context 'with dates after valid range' do
      it 'clamps year 2150 to maximum timestamp (2100-01-01)' do
        get :index, params: { date: '2150-01-01' }
        max_timestamp = Time.zone.parse('2100-01-01').to_i
        expect(response.body).to eq(max_timestamp.to_s)
      end

      it 'clamps year 3000 to maximum timestamp (2100-01-01)' do
        get :index, params: { date: '3000-12-31' }
        max_timestamp = Time.zone.parse('2100-01-01').to_i
        expect(response.body).to eq(max_timestamp.to_s)
      end
    end

    context 'with invalid date strings' do
      it 'returns current time for unparseable date' do
        travel_to Time.zone.parse('2023-06-15 12:00:00') do
          get :index, params: { date: 'not-a-date' }
          expected = Time.zone.now.to_i
          expect(response.body).to eq(expected.to_s)
        end
      end

      it 'returns current time for empty string' do
        travel_to Time.zone.parse('2023-06-15 12:00:00') do
          get :index, params: { date: '' }
          expected = Time.zone.now.to_i
          expect(response.body).to eq(expected.to_s)
        end
      end
    end

    context 'edge cases' do
      it 'handles Unix epoch exactly (1970-01-01)' do
        get :index, params: { date: '1970-01-01' }
        expected = Time.zone.parse('1970-01-01').to_i
        expect(response.body).to eq(expected.to_s)
      end

      it 'handles maximum date exactly (2100-01-01)' do
        get :index, params: { date: '2100-01-01' }
        expected = Time.zone.parse('2100-01-01').to_i
        expect(response.body).to eq(expected.to_s)
      end
    end
  end
end
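
The concern under test is not part of this diff, so the following is only a hypothetical reconstruction of safe_timestamp from the behaviour asserted above (clamp to the 1970-01-01..2100-01-01 window, fall back to the current time when the parameter is blank or unparseable); the names and structure are assumptions, not the shipped source:

    # Hypothetical sketch, not the actual concern.
    module SafeTimestampParser
      extend ActiveSupport::Concern

      MIN_DATE = '1970-01-01'
      MAX_DATE = '2100-01-01'

      private

      def safe_timestamp(value)
        parsed = Time.zone.parse(value.to_s)
        raise ArgumentError, 'blank or unparseable date' if parsed.nil?

        parsed.to_i.clamp(Time.zone.parse(MIN_DATE).to_i, Time.zone.parse(MAX_DATE).to_i)
      rescue ArgumentError
        Time.zone.now.to_i
      end
    end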

spec/factories/points_raw_data_archives.rb  (new file, 32 lines)
@@ -0,0 +1,32 @@
# frozen_string_literal: true

FactoryBot.define do
  factory :points_raw_data_archive, class: 'Points::RawDataArchive' do
    user
    year { 2024 }
    month { 6 }
    chunk_number { 1 }
    point_count { 100 }
    point_ids_checksum { Digest::SHA256.hexdigest('1,2,3') }
    archived_at { Time.current }
    metadata { { format_version: 1, compression: 'gzip' } }

    after(:build) do |archive|
      # Attach a test file
      archive.file.attach(
        io: StringIO.new(gzip_test_data),
        filename: archive.filename,
        content_type: 'application/gzip'
      )
    end
  end
end

def gzip_test_data
  io = StringIO.new
  gz = Zlib::GzipWriter.new(io)
  gz.puts({ id: 1, raw_data: { lon: 13.4, lat: 52.5 } }.to_json)
  gz.puts({ id: 2, raw_data: { lon: 13.5, lat: 52.6 } }.to_json)
  gz.close
  io.string
end

(File diff suppressed because one or more lines are too long.)

spec/fixtures/files/kml/points_with_timestamps.kmz  (new binary file, vendored; binary file not shown)

spec/jobs/points/raw_data/archive_job_spec.rb  (new file, 46 lines)
@@ -0,0 +1,46 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe Points::RawData::ArchiveJob, type: :job do
  describe '#perform' do
    let(:archiver) { instance_double(Points::RawData::Archiver) }

    before do
      # Enable archival for tests
      allow(ENV).to receive(:[]).and_call_original
      allow(ENV).to receive(:[]).with('ARCHIVE_RAW_DATA').and_return('true')

      allow(Points::RawData::Archiver).to receive(:new).and_return(archiver)
      allow(archiver).to receive(:call).and_return({ processed: 5, archived: 100, failed: 0 })
    end

    it 'calls the archiver service' do
      expect(archiver).to receive(:call)

      described_class.perform_now
    end

    context 'when archiver raises an error' do
      let(:error) { StandardError.new('Archive failed') }

      before do
        allow(archiver).to receive(:call).and_raise(error)
      end

      it 're-raises the error' do
        expect do
          described_class.perform_now
        end.to raise_error(StandardError, 'Archive failed')
      end

      it 'reports the error before re-raising' do
        expect(ExceptionReporter).to receive(:call).with(error, 'Points raw data archival job failed')

        expect do
          described_class.perform_now
        end.to raise_error(StandardError)
      end
    end
  end
end

spec/jobs/points/raw_data/re_archive_month_job_spec.rb  (new file, 34 lines)
@@ -0,0 +1,34 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe Points::RawData::ReArchiveMonthJob, type: :job do
  describe '#perform' do
    let(:archiver) { instance_double(Points::RawData::Archiver) }
    let(:user_id) { 123 }
    let(:year) { 2024 }
    let(:month) { 6 }

    before do
      allow(Points::RawData::Archiver).to receive(:new).and_return(archiver)
    end

    it 'calls archive_specific_month with correct parameters' do
      expect(archiver).to receive(:archive_specific_month).with(user_id, year, month)

      described_class.perform_now(user_id, year, month)
    end

    context 'when re-archival fails' do
      before do
        allow(archiver).to receive(:archive_specific_month).and_raise(StandardError, 'Re-archive failed')
      end

      it 're-raises the error' do
        expect do
          described_class.perform_now(user_id, year, month)
        end.to raise_error(StandardError, 'Re-archive failed')
      end
    end
  end
end

spec/models/concerns/archivable_spec.rb  (new file, 116 lines)
@@ -0,0 +1,116 @@
# frozen_string_literal: true
|
||||
|
||||
require 'rails_helper'
|
||||
|
||||
RSpec.describe Archivable, type: :model do
|
||||
let(:user) { create(:user) }
|
||||
let(:point) { create(:point, user: user, raw_data: { lon: 13.4, lat: 52.5 }) }
|
||||
|
||||
describe 'associations and scopes' do
|
||||
it { expect(point).to belong_to(:raw_data_archive).optional }
|
||||
|
||||
describe 'scopes' do
|
||||
let!(:archived_point) { create(:point, user: user, raw_data_archived: true) }
|
||||
let!(:not_archived_point) { create(:point, user: user, raw_data_archived: false) }
|
||||
|
||||
it '.archived returns archived points' do
|
||||
expect(Point.archived).to include(archived_point)
|
||||
expect(Point.archived).not_to include(not_archived_point)
|
||||
end
|
||||
|
||||
it '.not_archived returns non-archived points' do
|
||||
expect(Point.not_archived).to include(not_archived_point)
|
||||
expect(Point.not_archived).not_to include(archived_point)
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
describe '#raw_data_with_archive' do
|
||||
context 'when raw_data is present in database' do
|
||||
it 'returns raw_data from database' do
|
||||
expect(point.raw_data_with_archive).to eq({ 'lon' => 13.4, 'lat' => 52.5 })
|
||||
end
|
||||
end
|
||||
|
||||
context 'when raw_data is archived' do
|
||||
let(:archive) { create(:points_raw_data_archive, user: user) }
|
||||
let(:archived_point) do
|
||||
create(:point, user: user, raw_data: nil, raw_data_archived: true, raw_data_archive: archive)
|
||||
end
|
||||
|
||||
before do
|
||||
# Mock archive file content with this specific point
|
||||
compressed_data = gzip_data([
|
||||
{ id: archived_point.id, raw_data: { lon: 14.0, lat: 53.0 } }
|
||||
])
|
||||
allow(archive.file.blob).to receive(:download).and_return(compressed_data)
|
||||
end
|
||||
|
||||
it 'fetches raw_data from archive' do
|
||||
result = archived_point.raw_data_with_archive
|
||||
expect(result).to eq({ 'id' => archived_point.id, 'raw_data' => { 'lon' => 14.0, 'lat' => 53.0 } }['raw_data'])
|
||||
end
|
||||
end
|
||||
|
||||
context 'when raw_data is archived but point not in archive' do
|
||||
let(:archive) { create(:points_raw_data_archive, user: user) }
|
||||
let(:archived_point) do
|
||||
create(:point, user: user, raw_data: nil, raw_data_archived: true, raw_data_archive: archive)
|
||||
end
|
||||
|
||||
before do
|
||||
# Mock archive file with different point
|
||||
compressed_data = gzip_data([
|
||||
{ id: 999, raw_data: { lon: 14.0, lat: 53.0 } }
|
||||
])
|
||||
allow(archive.file.blob).to receive(:download).and_return(compressed_data)
|
||||
end
|
||||
|
||||
it 'returns empty hash' do
|
||||
expect(archived_point.raw_data_with_archive).to eq({})
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
describe '#restore_raw_data!' do
|
||||
let(:archive) { create(:points_raw_data_archive, user: user) }
|
||||
let(:archived_point) do
|
||||
create(:point, user: user, raw_data: nil, raw_data_archived: true, raw_data_archive: archive)
|
||||
end
|
||||
|
||||
it 'restores raw_data to database and clears archive flags' do
|
||||
new_data = { lon: 15.0, lat: 54.0 }
|
||||
archived_point.restore_raw_data!(new_data)
|
||||
|
||||
archived_point.reload
|
||||
expect(archived_point.raw_data).to eq(new_data.stringify_keys)
|
||||
expect(archived_point.raw_data_archived).to be false
|
||||
expect(archived_point.raw_data_archive_id).to be_nil
|
||||
end
|
||||
end
|
||||
|
||||
describe 'temporary cache' do
|
||||
let(:june_point) { create(:point, user: user, timestamp: Time.new(2024, 6, 15).to_i) }
|
||||
|
||||
it 'checks temporary restore cache with correct key format' do
|
||||
cache_key = "raw_data:temp:#{user.id}:2024:6:#{june_point.id}"
|
||||
cached_data = { lon: 16.0, lat: 55.0 }
|
||||
|
||||
Rails.cache.write(cache_key, cached_data, expires_in: 1.hour)
|
||||
|
||||
# Access through send since check_temporary_restore_cache is private
|
||||
result = june_point.send(:check_temporary_restore_cache)
|
||||
expect(result).to eq(cached_data)
|
||||
end
|
||||
end
|
||||
|
||||
def gzip_data(points_array)
|
||||
io = StringIO.new
|
||||
gz = Zlib::GzipWriter.new(io)
|
||||
points_array.each do |point_data|
|
||||
gz.puts(point_data.to_json)
|
||||
end
|
||||
gz.close
|
||||
io.string
|
||||
end
|
||||
end
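
Taken together, the examples above pin down a lookup order for raw_data_with_archive: the database value wins when present, then the temporary restore cache keyed as raw_data:temp:<user_id>:<year>:<month>:<point_id>, then the attached gzipped JSONL archive, with an empty hash when the point is missing from its chunk. A loose sketch of that resolution, with method and attribute names taken from the expectations but the body otherwise assumed:

    # Assumed shape of the reader; the real concern may differ in detail.
    def raw_data_with_archive
      return raw_data if raw_data.present?

      cached = check_temporary_restore_cache      # written by points:raw_data:restore_temporary
      return cached if cached.present?

      return {} unless raw_data_archived? && raw_data_archive

      # One {"id": ..., "raw_data": {...}} JSON object per line in the gzip stream.
      Zlib::GzipReader.new(StringIO.new(raw_data_archive.file.blob.download)).each_line do |line|
        row = JSON.parse(line)
        return row['raw_data'] if row['id'] == id
      end

      {}
    end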

spec/models/points/raw_data_archive_spec.rb  (new file, 86 lines)
@@ -0,0 +1,86 @@
# frozen_string_literal: true
|
||||
|
||||
require 'rails_helper'
|
||||
|
||||
RSpec.describe Points::RawDataArchive, type: :model do
|
||||
let(:user) { create(:user) }
|
||||
subject(:archive) { build(:points_raw_data_archive, user: user) }
|
||||
|
||||
describe 'associations' do
|
||||
it { is_expected.to belong_to(:user) }
|
||||
it { is_expected.to have_many(:points).dependent(:nullify) }
|
||||
end
|
||||
|
||||
describe 'validations' do
|
||||
it { is_expected.to validate_presence_of(:year) }
|
||||
it { is_expected.to validate_presence_of(:month) }
|
||||
it { is_expected.to validate_presence_of(:chunk_number) }
|
||||
it { is_expected.to validate_presence_of(:point_count) }
|
||||
it { is_expected.to validate_presence_of(:point_ids_checksum) }
|
||||
|
||||
it { is_expected.to validate_numericality_of(:year).is_greater_than(1970).is_less_than(2100) }
|
||||
it { is_expected.to validate_numericality_of(:month).is_greater_than_or_equal_to(1).is_less_than_or_equal_to(12) }
|
||||
it { is_expected.to validate_numericality_of(:chunk_number).is_greater_than(0) }
|
||||
|
||||
end
|
||||
|
||||
describe 'scopes' do
|
||||
let!(:recent_archive) { create(:points_raw_data_archive, user: user, year: 2024, month: 5, archived_at: 1.day.ago) }
|
||||
let!(:old_archive) { create(:points_raw_data_archive, user: user, year: 2023, month: 5, archived_at: 2.years.ago) }
|
||||
|
||||
describe '.recent' do
|
||||
it 'returns archives from last 30 days' do
|
||||
expect(described_class.recent).to include(recent_archive)
|
||||
expect(described_class.recent).not_to include(old_archive)
|
||||
end
|
||||
end
|
||||
|
||||
describe '.old' do
|
||||
it 'returns archives older than 1 year' do
|
||||
expect(described_class.old).to include(old_archive)
|
||||
expect(described_class.old).not_to include(recent_archive)
|
||||
end
|
||||
end
|
||||
|
||||
describe '.for_month' do
|
||||
let!(:june_archive) { create(:points_raw_data_archive, user: user, year: 2024, month: 6, chunk_number: 1) }
|
||||
let!(:june_archive_2) { create(:points_raw_data_archive, user: user, year: 2024, month: 6, chunk_number: 2) }
|
||||
let!(:july_archive) { create(:points_raw_data_archive, user: user, year: 2024, month: 7, chunk_number: 1) }
|
||||
|
||||
it 'returns archives for specific month ordered by chunk number' do
|
||||
result = described_class.for_month(user.id, 2024, 6)
|
||||
expect(result.map(&:chunk_number)).to eq([1, 2])
|
||||
expect(result).to include(june_archive, june_archive_2)
|
||||
expect(result).not_to include(july_archive)
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
describe '#month_display' do
|
||||
it 'returns formatted month and year' do
|
||||
archive = build(:points_raw_data_archive, year: 2024, month: 6)
|
||||
expect(archive.month_display).to eq('June 2024')
|
||||
end
|
||||
end
|
||||
|
||||
describe '#filename' do
|
||||
it 'generates correct filename with directory structure' do
|
||||
archive = build(:points_raw_data_archive, user_id: 123, year: 2024, month: 6, chunk_number: 5)
|
||||
expect(archive.filename).to eq('raw_data_archives/123/2024/06/005.jsonl.gz')
|
||||
end
|
||||
end
|
||||
|
||||
describe '#size_mb' do
|
||||
it 'returns 0 when no file attached' do
|
||||
archive = build(:points_raw_data_archive)
|
||||
expect(archive.size_mb).to eq(0)
|
||||
end
|
||||
|
||||
it 'returns size in MB when file is attached' do
|
||||
archive = create(:points_raw_data_archive, user: user)
|
||||
# Mock file with 2MB size
|
||||
allow(archive.file.blob).to receive(:byte_size).and_return(2 * 1024 * 1024)
|
||||
expect(archive.size_mb).to eq(2.0)
|
||||
end
|
||||
end
|
||||
end

@@ -199,8 +199,8 @@ RSpec.describe User, type: :model do

  describe '#total_distance' do
    subject { user.total_distance }

    let!(:stat1) { create(:stat, user:, distance: 10_000) }
    let!(:stat2) { create(:stat, user:, distance: 20_000) }
    let!(:stat1) { create(:stat, user:, year: 2020, month: 10, distance: 10_000) }
    let!(:stat2) { create(:stat, user:, year: 2020, month: 11, distance: 20_000) }

    it 'returns sum of distances' do
      expect(subject).to eq(30) # 30 km

@@ -341,14 +341,16 @@ RSpec.describe User, type: :model do
|
||||
|
||||
describe '.from_omniauth' do
|
||||
let(:auth_hash) do
|
||||
OmniAuth::AuthHash.new({
|
||||
provider: 'github',
|
||||
uid: '123545',
|
||||
info: {
|
||||
email: email,
|
||||
name: 'Test User'
|
||||
OmniAuth::AuthHash.new(
|
||||
{
|
||||
provider: 'github',
|
||||
uid: '123545',
|
||||
info: {
|
||||
email: email,
|
||||
name: 'Test User'
|
||||
}
|
||||
}
|
||||
})
|
||||
)
|
||||
end
|
||||
|
||||
context 'when user exists with the same email' do
|
||||
@@ -394,14 +396,16 @@ RSpec.describe User, type: :model do
|
||||
context 'when OAuth provider is Google' do
|
||||
let(:email) { 'google@example.com' }
|
||||
let(:auth_hash) do
|
||||
OmniAuth::AuthHash.new({
|
||||
provider: 'google_oauth2',
|
||||
uid: '123545',
|
||||
info: {
|
||||
email: email,
|
||||
name: 'Google User'
|
||||
OmniAuth::AuthHash.new(
|
||||
{
|
||||
provider: 'google_oauth2',
|
||||
uid: '123545',
|
||||
info: {
|
||||
email: email,
|
||||
name: 'Google User'
|
||||
}
|
||||
}
|
||||
})
|
||||
)
|
||||
end
|
||||
|
||||
it 'creates a user from Google OAuth data' do
|
||||
|
||||
@@ -5,8 +5,8 @@ require 'rails_helper'
|
||||
RSpec.describe 'Api::V1::Stats', type: :request do
|
||||
describe 'GET /index' do
|
||||
let(:user) { create(:user) }
|
||||
let(:stats_in_2020) { create_list(:stat, 12, year: 2020, user:) }
|
||||
let(:stats_in_2021) { create_list(:stat, 12, year: 2021, user:) }
|
||||
let(:stats_in_2020) { (1..12).map { |month| create(:stat, year: 2020, month:, user:) } }
|
||||
let(:stats_in_2021) { (1..12).map { |month| create(:stat, year: 2021, month:, user:) } }
|
||||
let(:points_in_2020) do
|
||||
(1..85).map do |i|
|
||||
create(:point, :with_geodata,
|
||||
@@ -50,17 +50,17 @@ RSpec.describe 'Api::V1::Stats', type: :request do
|
||||
totalCitiesVisited: 1,
|
||||
monthlyDistanceKm: {
|
||||
january: 1,
|
||||
february: 0,
|
||||
march: 0,
|
||||
april: 0,
|
||||
may: 0,
|
||||
june: 0,
|
||||
july: 0,
|
||||
august: 0,
|
||||
september: 0,
|
||||
october: 0,
|
||||
november: 0,
|
||||
december: 0
|
||||
february: 1,
|
||||
march: 1,
|
||||
april: 1,
|
||||
may: 1,
|
||||
june: 1,
|
||||
july: 1,
|
||||
august: 1,
|
||||
september: 1,
|
||||
october: 1,
|
||||
november: 1,
|
||||
december: 1
|
||||
}
|
||||
},
|
||||
{
|
||||
@@ -70,17 +70,17 @@ RSpec.describe 'Api::V1::Stats', type: :request do
|
||||
totalCitiesVisited: 1,
|
||||
monthlyDistanceKm: {
|
||||
january: 1,
|
||||
february: 0,
|
||||
march: 0,
|
||||
april: 0,
|
||||
may: 0,
|
||||
june: 0,
|
||||
july: 0,
|
||||
august: 0,
|
||||
september: 0,
|
||||
october: 0,
|
||||
november: 0,
|
||||
december: 0
|
||||
february: 1,
|
||||
march: 1,
|
||||
april: 1,
|
||||
may: 1,
|
||||
june: 1,
|
||||
july: 1,
|
||||
august: 1,
|
||||
september: 1,
|
||||
october: 1,
|
||||
november: 1,
|
||||
december: 1
|
||||
}
|
||||
}
|
||||
]
|
||||
@@ -100,4 +100,3 @@ RSpec.describe 'Api::V1::Stats', type: :request do
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
|
||||
@@ -35,7 +35,9 @@ RSpec.describe PointSerializer do
        'course_accuracy' => point.course_accuracy,
        'external_track_id' => point.external_track_id,
        'track_id' => point.track_id,
        'country_name' => point.read_attribute(:country_name)
        'country_name' => point.read_attribute(:country_name),
        'raw_data_archived' => point.raw_data_archived,
        'raw_data_archive_id' => point.raw_data_archive_id
      }
    end
|
||||
|
||||
|
||||
@@ -26,8 +26,8 @@ RSpec.describe StatsSerializer do
|
||||
end
|
||||
|
||||
context 'when the user has stats' do
|
||||
let!(:stats_in_2020) { create_list(:stat, 12, year: 2020, user:) }
|
||||
let!(:stats_in_2021) { create_list(:stat, 12, year: 2021, user:) }
|
||||
let!(:stats_in_2020) { (1..12).map { |month| create(:stat, year: 2020, month:, user:) } }
|
||||
let!(:stats_in_2021) { (1..12).map { |month| create(:stat, year: 2021, month:, user:) } }
|
||||
let!(:points_in_2020) do
|
||||
(1..85).map do |i|
|
||||
create(:point, :with_geodata,
|
||||
@@ -63,17 +63,17 @@ RSpec.describe StatsSerializer do
|
||||
"totalCitiesVisited": 1,
|
||||
"monthlyDistanceKm": {
|
||||
"january": 1,
|
||||
"february": 0,
|
||||
"march": 0,
|
||||
"april": 0,
|
||||
"may": 0,
|
||||
"june": 0,
|
||||
"july": 0,
|
||||
"august": 0,
|
||||
"september": 0,
|
||||
"october": 0,
|
||||
"november": 0,
|
||||
"december": 0
|
||||
"february": 1,
|
||||
"march": 1,
|
||||
"april": 1,
|
||||
"may": 1,
|
||||
"june": 1,
|
||||
"july": 1,
|
||||
"august": 1,
|
||||
"september": 1,
|
||||
"october": 1,
|
||||
"november": 1,
|
||||
"december": 1
|
||||
}
|
||||
},
|
||||
{
|
||||
@@ -83,17 +83,17 @@ RSpec.describe StatsSerializer do
|
||||
"totalCitiesVisited": 1,
|
||||
"monthlyDistanceKm": {
|
||||
"january": 1,
|
||||
"february": 0,
|
||||
"march": 0,
|
||||
"april": 0,
|
||||
"may": 0,
|
||||
"june": 0,
|
||||
"july": 0,
|
||||
"august": 0,
|
||||
"september": 0,
|
||||
"october": 0,
|
||||
"november": 0,
|
||||
"december": 0
|
||||
"february": 1,
|
||||
"march": 1,
|
||||
"april": 1,
|
||||
"may": 1,
|
||||
"june": 1,
|
||||
"july": 1,
|
||||
"august": 1,
|
||||
"september": 1,
|
||||
"october": 1,
|
||||
"november": 1,
|
||||
"december": 1
|
||||
}
|
||||
}
|
||||
]
|
||||
|
||||
@@ -142,6 +142,31 @@ RSpec.describe Kml::Importer do
    end
  end

  context 'when importing KMZ file (compressed KML)' do
    let(:file_path) { Rails.root.join('spec/fixtures/files/kml/points_with_timestamps.kmz').to_s }

    it 'extracts and processes KML from KMZ archive' do
      expect { parser }.to change(Point, :count).by(3)
    end

    it 'creates points with correct data from extracted KML' do
      parser

      point = user.points.order(:timestamp).first

      expect(point.lat).to eq(37.4220)
      expect(point.lon).to eq(-122.0841)
      expect(point.altitude).to eq(10)
      expect(point.timestamp).to eq(Time.zone.parse('2024-01-15T12:00:00Z').to_i)
    end

    it 'broadcasts importing progress' do
      expect_any_instance_of(Imports::Broadcaster).to receive(:broadcast_import_progress).at_least(1).time

      parser
    end
  end

  context 'when import fails' do
    let(:file_path) { Rails.root.join('spec/fixtures/files/kml/points_with_timestamps.kml').to_s }

spec/services/points/raw_data/archiver_spec.rb  (new file, 202 lines)
@@ -0,0 +1,202 @@
# frozen_string_literal: true
|
||||
|
||||
require 'rails_helper'
|
||||
|
||||
RSpec.describe Points::RawData::Archiver do
|
||||
let(:user) { create(:user) }
|
||||
let(:archiver) { described_class.new }
|
||||
|
||||
before do
|
||||
allow(PointsChannel).to receive(:broadcast_to)
|
||||
end
|
||||
|
||||
describe '#call' do
|
||||
context 'when archival is disabled' do
|
||||
before do
|
||||
allow(ENV).to receive(:[]).and_call_original
|
||||
allow(ENV).to receive(:[]).with('ARCHIVE_RAW_DATA').and_return('false')
|
||||
end
|
||||
|
||||
it 'returns early without processing' do
|
||||
result = archiver.call
|
||||
|
||||
expect(result).to eq({ processed: 0, archived: 0, failed: 0 })
|
||||
end
|
||||
end
|
||||
|
||||
context 'when archival is enabled' do
|
||||
before do
|
||||
allow(ENV).to receive(:[]).and_call_original
|
||||
allow(ENV).to receive(:[]).with('ARCHIVE_RAW_DATA').and_return('true')
|
||||
end
|
||||
|
||||
let!(:old_points) do
|
||||
# Create points 3 months ago (definitely older than 2 month lag)
|
||||
old_date = 3.months.ago.beginning_of_month
|
||||
create_list(:point, 5, user: user,
|
||||
timestamp: old_date.to_i,
|
||||
raw_data: { lon: 13.4, lat: 52.5 })
|
||||
end
|
||||
|
||||
it 'archives old points' do
|
||||
expect { archiver.call }.to change(Points::RawDataArchive, :count).by(1)
|
||||
end
|
||||
|
||||
it 'marks points as archived' do
|
||||
archiver.call
|
||||
|
||||
expect(Point.where(raw_data_archived: true).count).to eq(5)
|
||||
end
|
||||
|
||||
it 'keeps raw_data intact (does not clear yet)' do
|
||||
archiver.call
|
||||
Point.where(user: user).find_each do |point|
|
||||
expect(point.raw_data).to eq({ 'lon' => 13.4, 'lat' => 52.5 })
|
||||
end
|
||||
end
|
||||
|
||||
it 'returns correct stats' do
|
||||
result = archiver.call
|
||||
|
||||
expect(result[:processed]).to eq(1)
|
||||
expect(result[:archived]).to eq(5)
|
||||
expect(result[:failed]).to eq(0)
|
||||
end
|
||||
end
|
||||
|
||||
context 'with points from multiple months' do
|
||||
before do
|
||||
allow(ENV).to receive(:[]).and_call_original
|
||||
allow(ENV).to receive(:[]).with('ARCHIVE_RAW_DATA').and_return('true')
|
||||
end
|
||||
|
||||
let!(:june_points) do
|
||||
june_date = 4.months.ago.beginning_of_month
|
||||
create_list(:point, 3, user: user,
|
||||
timestamp: june_date.to_i,
|
||||
raw_data: { lon: 13.4, lat: 52.5 })
|
||||
end
|
||||
|
||||
let!(:july_points) do
|
||||
july_date = 3.months.ago.beginning_of_month
|
||||
create_list(:point, 2, user: user,
|
||||
timestamp: july_date.to_i,
|
||||
raw_data: { lon: 14.0, lat: 53.0 })
|
||||
end
|
||||
|
||||
it 'creates separate archives for each month' do
|
||||
expect { archiver.call }.to change(Points::RawDataArchive, :count).by(2)
|
||||
end
|
||||
|
||||
it 'archives all points' do
|
||||
archiver.call
|
||||
expect(Point.where(raw_data_archived: true).count).to eq(5)
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
describe '#archive_specific_month' do
|
||||
before do
|
||||
allow(ENV).to receive(:[]).and_call_original
|
||||
allow(ENV).to receive(:[]).with('ARCHIVE_RAW_DATA').and_return('true')
|
||||
end
|
||||
|
||||
let(:test_date) { 3.months.ago.beginning_of_month.utc }
|
||||
let!(:june_points) do
|
||||
create_list(:point, 3, user: user,
|
||||
timestamp: test_date.to_i,
|
||||
raw_data: { lon: 13.4, lat: 52.5 })
|
||||
end
|
||||
|
||||
it 'archives specific month' do
|
||||
expect do
|
||||
archiver.archive_specific_month(user.id, test_date.year, test_date.month)
|
||||
end.to change(Points::RawDataArchive, :count).by(1)
|
||||
end
|
||||
|
||||
it 'creates archive with correct metadata' do
|
||||
archiver.archive_specific_month(user.id, test_date.year, test_date.month)
|
||||
|
||||
archive = user.raw_data_archives.last
|
||||
|
||||
expect(archive.user_id).to eq(user.id)
|
||||
expect(archive.year).to eq(test_date.year)
|
||||
expect(archive.month).to eq(test_date.month)
|
||||
expect(archive.point_count).to eq(3)
|
||||
expect(archive.chunk_number).to eq(1)
|
||||
end
|
||||
|
||||
it 'attaches compressed file' do
|
||||
archiver.archive_specific_month(user.id, test_date.year, test_date.month)
|
||||
|
||||
archive = user.raw_data_archives.last
|
||||
expect(archive.file).to be_attached
|
||||
expect(archive.file.key).to match(%r{raw_data_archives/\d+/\d{4}/\d{2}/001\.jsonl\.gz})
|
||||
end
|
||||
end
|
||||
|
||||
describe 'append-only architecture' do
|
||||
before do
|
||||
allow(ENV).to receive(:[]).and_call_original
|
||||
allow(ENV).to receive(:[]).with('ARCHIVE_RAW_DATA').and_return('true')
|
||||
end
|
||||
|
||||
# Use UTC from the start to avoid timezone issues
|
||||
let(:test_date_utc) { 3.months.ago.utc.beginning_of_month }
|
||||
let!(:june_points_batch1) do
|
||||
create_list(:point, 2, user: user,
|
||||
timestamp: test_date_utc.to_i,
|
||||
raw_data: { lon: 13.4, lat: 52.5 })
|
||||
end
|
||||
|
||||
it 'creates additional chunks for same month' do
|
||||
# First archival
|
||||
archiver.archive_specific_month(user.id, test_date_utc.year, test_date_utc.month)
|
||||
expect(Points::RawDataArchive.for_month(user.id, test_date_utc.year, test_date_utc.month).count).to eq(1)
|
||||
expect(Points::RawDataArchive.last.chunk_number).to eq(1)
|
||||
|
||||
# Verify first batch is archived
|
||||
june_points_batch1.each(&:reload)
|
||||
expect(june_points_batch1.all?(&:raw_data_archived)).to be true
|
||||
|
||||
# Add more points for same month (retrospective import)
|
||||
# Use unique timestamps to avoid uniqueness validation errors
|
||||
mid_month = test_date_utc + 15.days
|
||||
june_points_batch2 = [
|
||||
create(:point, user: user, timestamp: mid_month.to_i, raw_data: { lon: 14.0, lat: 53.0 }),
|
||||
create(:point, user: user, timestamp: (mid_month + 1.hour).to_i, raw_data: { lon: 14.0, lat: 53.0 })
|
||||
]
|
||||
|
||||
# Verify second batch exists and is not archived
|
||||
expect(june_points_batch2.all? { |p| !p.raw_data_archived }).to be true
|
||||
|
||||
# Second archival should create chunk 2
|
||||
archiver.archive_specific_month(user.id, test_date_utc.year, test_date_utc.month)
|
||||
expect(Points::RawDataArchive.for_month(user.id, test_date_utc.year, test_date_utc.month).count).to eq(2)
|
||||
expect(Points::RawDataArchive.last.chunk_number).to eq(2)
|
||||
end
|
||||
end
|
||||
|
||||
describe 'advisory locking' do
|
||||
before do
|
||||
allow(ENV).to receive(:[]).and_call_original
|
||||
allow(ENV).to receive(:[]).with('ARCHIVE_RAW_DATA').and_return('true')
|
||||
end
|
||||
|
||||
let!(:june_points) do
|
||||
old_date = 3.months.ago.beginning_of_month
|
||||
create_list(:point, 2, user: user,
|
||||
timestamp: old_date.to_i,
|
||||
raw_data: { lon: 13.4, lat: 52.5 })
|
||||
end
|
||||
|
||||
it 'prevents duplicate processing with advisory locks' do
|
||||
# Simulate lock couldn't be acquired (returns nil/false)
|
||||
allow(ActiveRecord::Base).to receive(:with_advisory_lock).and_return(false)
|
||||
|
||||
result = archiver.call
|
||||
expect(result[:processed]).to eq(0)
|
||||
expect(result[:failed]).to eq(0)
|
||||
end
|
||||
end
|
||||
end
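
The advisory-locking context stubs ActiveRecord::Base.with_advisory_lock to return false and expects the run to be skipped, which implies each month is archived under a non-blocking advisory lock. A rough sketch of that guard, assuming the with_advisory_lock gem's block API; the lock name and helper method are illustrative, not the actual implementation:

    # Illustrative only; lock naming is an assumption.
    def archive_month_with_lock(user_id, year, month)
      lock_name = "raw_data_archive:#{user_id}:#{year}:#{month}"

      acquired = ActiveRecord::Base.with_advisory_lock(lock_name, timeout_seconds: 0) do
        archive_specific_month(user_id, year, month)
        true
      end

      acquired || false # false means another process holds the lock, so the month is skipped
    end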

spec/services/points/raw_data/chunk_compressor_spec.rb  (new file, 94 lines)
@@ -0,0 +1,94 @@
# frozen_string_literal: true
|
||||
|
||||
require 'rails_helper'
|
||||
|
||||
RSpec.describe Points::RawData::ChunkCompressor do
|
||||
let(:user) { create(:user) }
|
||||
|
||||
before do
|
||||
# Stub broadcasting to avoid ActionCable issues in tests
|
||||
allow(PointsChannel).to receive(:broadcast_to)
|
||||
end
|
||||
let(:points) do
|
||||
[
|
||||
create(:point, user: user, raw_data: { lon: 13.4, lat: 52.5 }),
|
||||
create(:point, user: user, raw_data: { lon: 13.5, lat: 52.6 }),
|
||||
create(:point, user: user, raw_data: { lon: 13.6, lat: 52.7 })
|
||||
]
|
||||
end
|
||||
let(:compressor) { described_class.new(Point.where(id: points.map(&:id))) }
|
||||
|
||||
describe '#compress' do
|
||||
it 'returns compressed gzip data' do
|
||||
result = compressor.compress
|
||||
expect(result).to be_a(String)
|
||||
expect(result.encoding.name).to eq('ASCII-8BIT')
|
||||
end
|
||||
|
||||
it 'compresses points as JSONL format' do
|
||||
compressed = compressor.compress
|
||||
|
||||
# Decompress and verify format
|
||||
io = StringIO.new(compressed)
|
||||
gz = Zlib::GzipReader.new(io)
|
||||
lines = gz.readlines
|
||||
gz.close
|
||||
|
||||
expect(lines.count).to eq(3)
|
||||
|
||||
# Each line should be valid JSON
|
||||
lines.each_with_index do |line, index|
|
||||
data = JSON.parse(line)
|
||||
expect(data).to have_key('id')
|
||||
expect(data).to have_key('raw_data')
|
||||
expect(data['id']).to eq(points[index].id)
|
||||
end
|
||||
end
|
||||
|
||||
it 'includes point ID and raw_data in each line' do
|
||||
compressed = compressor.compress
|
||||
|
||||
io = StringIO.new(compressed)
|
||||
gz = Zlib::GzipReader.new(io)
|
||||
first_line = gz.readline
|
||||
gz.close
|
||||
|
||||
data = JSON.parse(first_line)
|
||||
expect(data['id']).to eq(points.first.id)
|
||||
expect(data['raw_data']).to eq({ 'lon' => 13.4, 'lat' => 52.5 })
|
||||
end
|
||||
|
||||
it 'processes points in batches' do
|
||||
# Create many points to test batch processing with unique timestamps
|
||||
many_points = []
|
||||
base_time = Time.new(2024, 6, 15).to_i
|
||||
2500.times do |i|
|
||||
many_points << create(:point, user: user, timestamp: base_time + i, raw_data: { lon: 13.4, lat: 52.5 })
|
||||
end
|
||||
large_compressor = described_class.new(Point.where(id: many_points.map(&:id)))
|
||||
|
||||
compressed = large_compressor.compress
|
||||
|
||||
io = StringIO.new(compressed)
|
||||
gz = Zlib::GzipReader.new(io)
|
||||
line_count = 0
|
||||
gz.each_line { line_count += 1 }
|
||||
gz.close
|
||||
|
||||
expect(line_count).to eq(2500)
|
||||
end
|
||||
|
||||
it 'produces smaller compressed output than uncompressed' do
|
||||
compressed = compressor.compress
|
||||
|
||||
# Decompress to get original size
|
||||
io = StringIO.new(compressed)
|
||||
gz = Zlib::GzipReader.new(io)
|
||||
decompressed = gz.read
|
||||
gz.close
|
||||
|
||||
# Compressed should be smaller
|
||||
expect(compressed.bytesize).to be < decompressed.bytesize
|
||||
end
|
||||
end
|
||||
end
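Read together, the expectations above imply that the compressor takes a Point relation and returns a gzipped JSON-Lines string with one {id, raw_data} object per point, streamed in batches. A rough sketch of that shape follows, assuming `find_each` for batching; the real service may be structured differently.

# Illustrative sketch only, inferred from the spec; not the actual service code.
require 'json'
require 'stringio'
require 'zlib'

class ChunkCompressorSketch
  def initialize(points_relation)
    @points = points_relation
  end

  # Returns a binary (ASCII-8BIT) gzip string, one JSON object per line.
  def compress
    io = StringIO.new
    gz = Zlib::GzipWriter.new(io)
    # find_each loads records in batches (1000 by default), which matches
    # the "processes points in batches" expectation above.
    @points.find_each do |point|
      gz.puts({ id: point.id, raw_data: point.raw_data }.to_json)
    end
    gz.close
    io.string
  end
end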

spec/services/points/raw_data/clearer_spec.rb (Normal file, 165 lines)
@@ -0,0 +1,165 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe Points::RawData::Clearer do
  let(:user) { create(:user) }
  let(:clearer) { described_class.new }

  before do
    allow(PointsChannel).to receive(:broadcast_to)
  end

  describe '#clear_specific_archive' do
    let(:test_date) { 3.months.ago.beginning_of_month.utc }
    let!(:points) do
      create_list(:point, 5, user: user,
                  timestamp: test_date.to_i,
                  raw_data: { lon: 13.4, lat: 52.5 })
    end

    let(:archive) do
      # Create and verify archive
      archiver = Points::RawData::Archiver.new
      archiver.archive_specific_month(user.id, test_date.year, test_date.month)

      archive = Points::RawDataArchive.last
      verifier = Points::RawData::Verifier.new
      verifier.verify_specific_archive(archive.id)

      archive.reload
    end

    it 'clears raw_data for verified archive' do
      expect(Point.where(user: user).pluck(:raw_data)).to all(eq({ 'lon' => 13.4, 'lat' => 52.5 }))

      clearer.clear_specific_archive(archive.id)

      expect(Point.where(user: user).pluck(:raw_data)).to all(eq({}))
    end

    it 'does not clear unverified archive' do
      # Create unverified archive
      archiver = Points::RawData::Archiver.new
      mid_month = test_date + 15.days
      create_list(:point, 3, user: user,
                  timestamp: mid_month.to_i,
                  raw_data: { lon: 14.0, lat: 53.0 })
      archiver.archive_specific_month(user.id, test_date.year, test_date.month)

      unverified_archive = Points::RawDataArchive.where(verified_at: nil).last

      result = clearer.clear_specific_archive(unverified_archive.id)

      expect(result[:cleared]).to eq(0)
    end

    it 'is idempotent (safe to run multiple times)' do
      clearer.clear_specific_archive(archive.id)
      first_result = Point.where(user: user).pluck(:raw_data)

      clearer.clear_specific_archive(archive.id)
      second_result = Point.where(user: user).pluck(:raw_data)

      expect(first_result).to eq(second_result)
      expect(first_result).to all(eq({}))
    end
  end

  describe '#clear_month' do
    let(:test_date) { 3.months.ago.beginning_of_month.utc }

    before do
      # Create points and archive
      create_list(:point, 5, user: user,
                  timestamp: test_date.to_i,
                  raw_data: { lon: 13.4, lat: 52.5 })

      archiver = Points::RawData::Archiver.new
      archiver.archive_specific_month(user.id, test_date.year, test_date.month)

      # Verify archive
      verifier = Points::RawData::Verifier.new
      verifier.verify_month(user.id, test_date.year, test_date.month)
    end

    it 'clears all verified archives for a month' do
      expect(Point.where(user: user, raw_data: {}).count).to eq(0)

      clearer.clear_month(user.id, test_date.year, test_date.month)

      expect(Point.where(user: user, raw_data: {}).count).to eq(5)
    end
  end

  describe '#call' do
    let(:test_date) { 3.months.ago.beginning_of_month.utc }

    before do
      # Create points and archive
      create_list(:point, 5, user: user,
                  timestamp: test_date.to_i,
                  raw_data: { lon: 13.4, lat: 52.5 })

      archiver = Points::RawData::Archiver.new
      archiver.archive_specific_month(user.id, test_date.year, test_date.month)

      # Verify archive
      verifier = Points::RawData::Verifier.new
      verifier.verify_month(user.id, test_date.year, test_date.month)
    end

    it 'clears all verified archives' do
      expect(Point.where(raw_data: {}).count).to eq(0)

      result = clearer.call

      expect(result[:cleared]).to eq(5)
      expect(Point.where(raw_data: {}).count).to eq(5)
    end

    it 'skips unverified archives' do
      # Create another month without verifying
      new_date = 4.months.ago.beginning_of_month.utc
      create_list(:point, 3, user: user,
                  timestamp: new_date.to_i,
                  raw_data: { lon: 14.0, lat: 53.0 })

      archiver = Points::RawData::Archiver.new
      archiver.archive_specific_month(user.id, new_date.year, new_date.month)

      result = clearer.call

      # Should only clear the verified month (5 points)
      expect(result[:cleared]).to eq(5)

      # Unverified month should still have raw_data
      unverified_points = Point.where(user: user)
                               .where("timestamp >= ? AND timestamp < ?",
                                      new_date.to_i,
                                      (new_date + 1.month).to_i)
      expect(unverified_points.pluck(:raw_data)).to all(eq({ 'lon' => 14.0, 'lat' => 53.0 }))
    end

    it 'is idempotent (safe to run multiple times)' do
      first_result = clearer.call

      # Use a new instance for second call
      new_clearer = Points::RawData::Clearer.new
      second_result = new_clearer.call

      expect(first_result[:cleared]).to eq(5)
      expect(second_result[:cleared]).to eq(0) # Already cleared
    end

    it 'handles large batches' do
      # Stub batch size to test batching logic
      stub_const('Points::RawData::Clearer::BATCH_SIZE', 2)

      result = clearer.call

      expect(result[:cleared]).to eq(5)
      expect(Point.where(raw_data: {}).count).to eq(5)
    end
  end
end
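Taken together, these expectations say the clearer only touches archives whose verified_at is set, replaces raw_data with an empty hash in batches, and reports how many rows it changed. A condensed sketch of that behaviour is shown below; it assumes the archive model has a `points` association, the batch constant is only borrowed from the `stub_const` above, and everything else is guesswork rather than the project's code.

# Illustrative sketch only, inferred from the expectations above.
class ClearerSketch
  BATCH_SIZE = 1_000 # the spec stubs this constant to force multiple batches

  def call
    cleared = 0
    # Only archives that passed verification may have their raw_data cleared.
    Points::RawDataArchive.where.not(verified_at: nil).find_each do |archive|
      # Skipping already-empty raw_data keeps repeated runs idempotent.
      archive.points.where.not(raw_data: {}).in_batches(of: BATCH_SIZE) do |batch|
        cleared += batch.update_all(raw_data: {})
      end
    end
    { cleared: cleared }
  end
end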

spec/services/points/raw_data/restorer_spec.rb (Normal file, 228 lines)
@@ -0,0 +1,228 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe Points::RawData::Restorer do
  let(:user) { create(:user) }
  let(:restorer) { described_class.new }

  before do
    # Stub broadcasting to avoid ActionCable issues in tests
    allow(PointsChannel).to receive(:broadcast_to)
  end

  describe '#restore_to_database' do
    let!(:archived_points) do
      create_list(:point, 3, user: user, timestamp: Time.new(2024, 6, 15).to_i,
                  raw_data: nil, raw_data_archived: true)
    end

    let(:archive) do
      # Create archive with actual point data
      compressed_data = gzip_points_data(archived_points.map do |p|
        { id: p.id, raw_data: { lon: 13.4, lat: 52.5 } }
      end)

      arc = build(:points_raw_data_archive, user: user, year: 2024, month: 6)
      arc.file.attach(
        io: StringIO.new(compressed_data),
        filename: arc.filename,
        content_type: 'application/gzip'
      )
      arc.save!

      # Associate points with archive
      archived_points.each { |p| p.update!(raw_data_archive: arc) }

      arc
    end

    it 'restores raw_data to database' do
      archive # Ensure archive is created before restore
      restorer.restore_to_database(user.id, 2024, 6)

      archived_points.each(&:reload)
      archived_points.each do |point|
        expect(point.raw_data).to eq({ 'lon' => 13.4, 'lat' => 52.5 })
      end
    end

    it 'clears archive flags' do
      archive # Ensure archive is created before restore
      restorer.restore_to_database(user.id, 2024, 6)

      archived_points.each(&:reload)
      archived_points.each do |point|
        expect(point.raw_data_archived).to be false
        expect(point.raw_data_archive_id).to be_nil
      end
    end

    it 'raises error when no archives found' do
      expect do
        restorer.restore_to_database(user.id, 2025, 12)
      end.to raise_error(/No archives found/)
    end

    context 'with multiple chunks' do
      let!(:more_points) do
        create_list(:point, 2, user: user, timestamp: Time.new(2024, 6, 20).to_i,
                    raw_data: nil, raw_data_archived: true)
      end

      let!(:archive2) do
        compressed_data = gzip_points_data(more_points.map do |p|
          { id: p.id, raw_data: { lon: 14.0, lat: 53.0 } }
        end)

        arc = build(:points_raw_data_archive, user: user, year: 2024, month: 6, chunk_number: 2)
        arc.file.attach(
          io: StringIO.new(compressed_data),
          filename: arc.filename,
          content_type: 'application/gzip'
        )
        arc.save!

        more_points.each { |p| p.update!(raw_data_archive: arc) }

        arc
      end

      it 'restores from all chunks' do
        archive # Ensure first archive is created
        archive2 # Ensure second archive is created
        restorer.restore_to_database(user.id, 2024, 6)

        (archived_points + more_points).each(&:reload)
        expect(archived_points.first.raw_data).to eq({ 'lon' => 13.4, 'lat' => 52.5 })
        expect(more_points.first.raw_data).to eq({ 'lon' => 14.0, 'lat' => 53.0 })
      end
    end
  end

  describe '#restore_to_memory' do
    let!(:archived_points) do
      create_list(:point, 2, user: user, timestamp: Time.new(2024, 6, 15).to_i,
                  raw_data: nil, raw_data_archived: true)
    end

    let(:archive) do
      compressed_data = gzip_points_data(archived_points.map do |p|
        { id: p.id, raw_data: { lon: 13.4, lat: 52.5 } }
      end)

      arc = build(:points_raw_data_archive, user: user, year: 2024, month: 6)
      arc.file.attach(
        io: StringIO.new(compressed_data),
        filename: arc.filename,
        content_type: 'application/gzip'
      )
      arc.save!

      archived_points.each { |p| p.update!(raw_data_archive: arc) }

      arc
    end

    it 'loads data into cache' do
      archive # Ensure archive is created before restore
      restorer.restore_to_memory(user.id, 2024, 6)

      archived_points.each do |point|
        cache_key = "raw_data:temp:#{user.id}:2024:6:#{point.id}"
        cached_value = Rails.cache.read(cache_key)
        expect(cached_value).to eq({ 'lon' => 13.4, 'lat' => 52.5 })
      end
    end

    it 'does not modify database' do
      archive # Ensure archive is created before restore
      restorer.restore_to_memory(user.id, 2024, 6)

      archived_points.each(&:reload)
      archived_points.each do |point|
        expect(point.raw_data).to be_nil
        expect(point.raw_data_archived).to be true
      end
    end

    it 'sets cache expiration to 1 hour' do
      archive # Ensure archive is created before restore
      restorer.restore_to_memory(user.id, 2024, 6)

      cache_key = "raw_data:temp:#{user.id}:2024:6:#{archived_points.first.id}"

      # Cache should exist now
      expect(Rails.cache.exist?(cache_key)).to be true
    end
  end

  describe '#restore_all_for_user' do
    let!(:june_points) do
      create_list(:point, 2, user: user, timestamp: Time.new(2024, 6, 15).to_i,
                  raw_data: nil, raw_data_archived: true)
    end

    let!(:july_points) do
      create_list(:point, 2, user: user, timestamp: Time.new(2024, 7, 15).to_i,
                  raw_data: nil, raw_data_archived: true)
    end

    let!(:june_archive) do
      compressed_data = gzip_points_data(june_points.map { |p| { id: p.id, raw_data: { month: 'june' } } })

      arc = build(:points_raw_data_archive, user: user, year: 2024, month: 6)
      arc.file.attach(
        io: StringIO.new(compressed_data),
        filename: arc.filename,
        content_type: 'application/gzip'
      )
      arc.save!

      june_points.each { |p| p.update!(raw_data_archive: arc) }
      arc
    end

    let!(:july_archive) do
      compressed_data = gzip_points_data(july_points.map { |p| { id: p.id, raw_data: { month: 'july' } } })

      arc = build(:points_raw_data_archive, user: user, year: 2024, month: 7)
      arc.file.attach(
        io: StringIO.new(compressed_data),
        filename: arc.filename,
        content_type: 'application/gzip'
      )
      arc.save!

      july_points.each { |p| p.update!(raw_data_archive: arc) }
      arc
    end

    it 'restores all months for user' do
      restorer.restore_all_for_user(user.id)

      june_points.each(&:reload)
      july_points.each(&:reload)

      expect(june_points.first.raw_data).to eq({ 'month' => 'june' })
      expect(july_points.first.raw_data).to eq({ 'month' => 'july' })
    end

    it 'clears all archive flags' do
      restorer.restore_all_for_user(user.id)

      (june_points + july_points).each(&:reload)
      expect(Point.where(user: user, raw_data_archived: true).count).to eq(0)
    end
  end

  def gzip_points_data(points_array)
    io = StringIO.new
    gz = Zlib::GzipWriter.new(io)
    points_array.each do |point_data|
      gz.puts(point_data.to_json)
    end
    gz.close
    io.string
  end
end
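The cache-key format `raw_data:temp:<user_id>:<year>:<month>:<point_id>` and the non-destructive behaviour of `#restore_to_memory` are fixed by the spec above. A minimal sketch of how such a method could read an archive's gzipped JSONL and populate the cache follows; the archive lookup, the ActiveStorage `download` call, and the one-hour TTL are assumptions for illustration rather than the project's implementation.

# Illustrative sketch only, inferred from the spec; not the project's code.
require 'json'
require 'stringio'
require 'zlib'

class RestorerSketch
  def restore_to_memory(user_id, year, month)
    archives = Points::RawDataArchive.where(user_id: user_id, year: year, month: month)
    raise "No archives found for #{year}-#{month}" if archives.none?

    archives.find_each do |archive|
      # Each archive file is gzipped JSONL: one {"id":..., "raw_data":...} per line.
      gz = Zlib::GzipReader.new(StringIO.new(archive.file.download))
      gz.each_line do |line|
        row = JSON.parse(line)
        cache_key = "raw_data:temp:#{user_id}:#{year}:#{month}:#{row['id']}"
        Rails.cache.write(cache_key, row['raw_data'], expires_in: 1.hour)
      end
      gz.close
    end
  end
end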

spec/services/points/raw_data/verifier_spec.rb (Normal file, 166 lines)
@@ -0,0 +1,166 @@
# frozen_string_literal: true

require 'rails_helper'

RSpec.describe Points::RawData::Verifier do
  let(:user) { create(:user) }
  let(:verifier) { described_class.new }

  before do
    allow(PointsChannel).to receive(:broadcast_to)
  end

  describe '#verify_specific_archive' do
    let(:test_date) { 3.months.ago.beginning_of_month.utc }
    let!(:points) do
      create_list(:point, 5, user: user,
                  timestamp: test_date.to_i,
                  raw_data: { lon: 13.4, lat: 52.5 })
    end

    let(:archive) do
      # Create archive
      archiver = Points::RawData::Archiver.new
      archiver.archive_specific_month(user.id, test_date.year, test_date.month)
      Points::RawDataArchive.last
    end

    it 'verifies a valid archive successfully' do
      expect(archive.verified_at).to be_nil

      verifier.verify_specific_archive(archive.id)
      archive.reload

      expect(archive.verified_at).to be_present
    end

    it 'detects missing file' do
      archive.file.purge
      archive.reload

      expect do
        verifier.verify_specific_archive(archive.id)
      end.not_to change { archive.reload.verified_at }
    end

    it 'detects point count mismatch' do
      # Tamper with point count
      archive.update_column(:point_count, 999)

      expect do
        verifier.verify_specific_archive(archive.id)
      end.not_to change { archive.reload.verified_at }
    end

    it 'detects checksum mismatch' do
      # Tamper with checksum
      archive.update_column(:point_ids_checksum, 'invalid')

      expect do
        verifier.verify_specific_archive(archive.id)
      end.not_to change { archive.reload.verified_at }
    end

    it 'detects deleted points' do
      # Force archive creation first
      archive_id = archive.id

      # Then delete one point from database
      points.first.destroy

      expect do
        verifier.verify_specific_archive(archive_id)
      end.not_to change { archive.reload.verified_at }
    end

    it 'detects raw_data mismatch between archive and database' do
      # Force archive creation first
      archive_id = archive.id

      # Then modify raw_data in database after archiving
      points.first.update_column(:raw_data, { lon: 999.0, lat: 999.0 })

      expect do
        verifier.verify_specific_archive(archive_id)
      end.not_to change { archive.reload.verified_at }
    end

    it 'verifies raw_data matches between archive and database' do
      # Ensure data hasn't changed
      expect(points.first.raw_data).to eq({ 'lon' => 13.4, 'lat' => 52.5 })

      verifier.verify_specific_archive(archive.id)

      expect(archive.reload.verified_at).to be_present
    end
  end

  describe '#verify_month' do
    let(:test_date) { 3.months.ago.beginning_of_month.utc }

    before do
      # Create points
      create_list(:point, 5, user: user,
                  timestamp: test_date.to_i,
                  raw_data: { lon: 13.4, lat: 52.5 })

      # Archive them
      archiver = Points::RawData::Archiver.new
      archiver.archive_specific_month(user.id, test_date.year, test_date.month)
    end

    it 'verifies all archives for a month' do
      expect(Points::RawDataArchive.where(verified_at: nil).count).to eq(1)

      verifier.verify_month(user.id, test_date.year, test_date.month)

      expect(Points::RawDataArchive.where(verified_at: nil).count).to eq(0)
    end
  end

  describe '#call' do
    let(:test_date) { 3.months.ago.beginning_of_month.utc }

    before do
      # Create points and archive
      create_list(:point, 5, user: user,
                  timestamp: test_date.to_i,
                  raw_data: { lon: 13.4, lat: 52.5 })

      archiver = Points::RawData::Archiver.new
      archiver.archive_specific_month(user.id, test_date.year, test_date.month)
    end

    it 'verifies all unverified archives' do
      expect(Points::RawDataArchive.where(verified_at: nil).count).to eq(1)

      result = verifier.call

      expect(result[:verified]).to eq(1)
      expect(result[:failed]).to eq(0)
      expect(Points::RawDataArchive.where(verified_at: nil).count).to eq(0)
    end

    it 'reports failures' do
      # Tamper with archive
      Points::RawDataArchive.last.update_column(:point_count, 999)

      result = verifier.call

      expect(result[:verified]).to eq(0)
      expect(result[:failed]).to eq(1)
    end

    it 'skips already verified archives' do
      # Verify once
      verifier.call

      # Try to verify again with a new verifier instance
      new_verifier = Points::RawData::Verifier.new
      result = new_verifier.call

      expect(result[:verified]).to eq(0)
      expect(result[:failed]).to eq(0)
    end
  end
end
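The checks the verifier performs can be read off the failure cases above: missing file, point-count mismatch, checksum mismatch, deleted points, and raw_data drift between archive and database. A condensed sketch along those lines is shown below; the checksum formula, the JSONL reader, and anything else not asserted by the spec are assumptions, not the project's implementation.

# Illustrative sketch only; checks inferred from the failure cases in the spec.
require 'digest'
require 'json'
require 'stringio'
require 'zlib'

class VerifierSketch
  def verify_specific_archive(archive_id)
    archive = Points::RawDataArchive.find(archive_id)
    return false unless archive.file.attached?            # "detects missing file"

    rows = read_jsonl(archive)                             # [{'id' => .., 'raw_data' => ..}, ...]
    return false unless rows.size == archive.point_count   # "detects point count mismatch"

    ids = rows.map { |r| r['id'] }.sort
    checksum = Digest::SHA256.hexdigest(ids.join(','))     # assumed checksum formula
    return false unless checksum == archive.point_ids_checksum

    points = Point.where(id: ids).index_by(&:id)
    return false unless points.size == ids.size            # "detects deleted points"
    return false unless rows.all? { |r| points[r['id']].raw_data == r['raw_data'] }

    archive.update!(verified_at: Time.current)
    true
  end

  private

  def read_jsonl(archive)
    gz = Zlib::GzipReader.new(StringIO.new(archive.file.download))
    rows = gz.each_line.map { |line| JSON.parse(line) }
    gz.close
    rows
  end
end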

@@ -93,6 +93,114 @@ RSpec.describe Stats::CalculateMonth
          expect(user.stats.last.distance).to be_within(1000).of(340_000)
        end
      end

      context 'when calculating visited cities and countries' do
        let(:timestamp_base) { DateTime.new(year, month, 1, 12).to_i }
        let!(:import) { create(:import, user:) }

        context 'when user spent more than MIN_MINUTES_SPENT_IN_CITY in a city' do
          let!(:berlin_points) do
            [
              create(:point, user:, import:, timestamp: timestamp_base,
                             city: 'Berlin', country_name: 'Germany',
                             lonlat: 'POINT(13.404954 52.520008)'),
              create(:point, user:, import:, timestamp: timestamp_base + 30.minutes,
                             city: 'Berlin', country_name: 'Germany',
                             lonlat: 'POINT(13.404954 52.520008)'),
              create(:point, user:, import:, timestamp: timestamp_base + 70.minutes,
                             city: 'Berlin', country_name: 'Germany',
                             lonlat: 'POINT(13.404954 52.520008)')
            ]
          end

          it 'includes the city in toponyms' do
            calculate_stats

            stat = user.stats.last
            expect(stat.toponyms).not_to be_empty
            expect(stat.toponyms.first['country']).to eq('Germany')
            expect(stat.toponyms.first['cities']).not_to be_empty
            expect(stat.toponyms.first['cities'].first['city']).to eq('Berlin')
          end
        end

        context 'when user spent less than MIN_MINUTES_SPENT_IN_CITY in a city' do
          let!(:prague_points) do
            [
              create(:point, user:, import:, timestamp: timestamp_base,
                             city: 'Prague', country_name: 'Czech Republic',
                             lonlat: 'POINT(14.4378 50.0755)'),
              create(:point, user:, import:, timestamp: timestamp_base + 10.minutes,
                             city: 'Prague', country_name: 'Czech Republic',
                             lonlat: 'POINT(14.4378 50.0755)'),
              create(:point, user:, import:, timestamp: timestamp_base + 20.minutes,
                             city: 'Prague', country_name: 'Czech Republic',
                             lonlat: 'POINT(14.4378 50.0755)')
            ]
          end

          it 'excludes the city from toponyms' do
            calculate_stats

            stat = user.stats.last
            expect(stat.toponyms).not_to be_empty

            # Country should be listed but with no cities
            czech_country = stat.toponyms.find { |t| t['country'] == 'Czech Republic' }
            expect(czech_country).not_to be_nil
            expect(czech_country['cities']).to be_empty
          end
        end

        context 'when user visited multiple cities with mixed durations' do
          let!(:mixed_points) do
            [
              # Berlin: 70 minutes (should be included)
              create(:point, user:, import:, timestamp: timestamp_base,
                             city: 'Berlin', country_name: 'Germany',
                             lonlat: 'POINT(13.404954 52.520008)'),
              create(:point, user:, import:, timestamp: timestamp_base + 70.minutes,
                             city: 'Berlin', country_name: 'Germany',
                             lonlat: 'POINT(13.404954 52.520008)'),

              # Prague: 20 minutes (should be excluded)
              create(:point, user:, import:, timestamp: timestamp_base + 100.minutes,
                             city: 'Prague', country_name: 'Czech Republic',
                             lonlat: 'POINT(14.4378 50.0755)'),
              create(:point, user:, import:, timestamp: timestamp_base + 120.minutes,
                             city: 'Prague', country_name: 'Czech Republic',
                             lonlat: 'POINT(14.4378 50.0755)'),

              # Vienna: 90 minutes (should be included)
              create(:point, user:, import:, timestamp: timestamp_base + 150.minutes,
                             city: 'Vienna', country_name: 'Austria',
                             lonlat: 'POINT(16.3738 48.2082)'),
              create(:point, user:, import:, timestamp: timestamp_base + 240.minutes,
                             city: 'Vienna', country_name: 'Austria',
                             lonlat: 'POINT(16.3738 48.2082)')
            ]
          end

          it 'only includes cities where user spent >= MIN_MINUTES_SPENT_IN_CITY' do
            calculate_stats

            stat = user.stats.last
            expect(stat.toponyms).not_to be_empty

            # Get all cities from all countries
            all_cities = stat.toponyms.flat_map { |t| t['cities'].map { |c| c['city'] } }

            # Berlin and Vienna should be included
            expect(all_cities).to include('Berlin', 'Vienna')

            # Prague should NOT be included
            expect(all_cities).not_to include('Prague')

            # Should have exactly 2 cities
            expect(all_cities.size).to eq(2)
          end
        end
      end
    end
  end
end
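The three contexts above pin down the intended behaviour: a city appears under its country in toponyms only when the span between the user's first and last point there reaches MIN_MINUTES_SPENT_IN_CITY. A toy illustration of that filter follows; the threshold value and the grouping approach are assumptions, not the service's actual code.

# Toy illustration of the filtering rule exercised above; not the real implementation.
MIN_MINUTES_SPENT_IN_CITY = 60 # assumed threshold; the spec contrasts 70-90 min stays with 10-20 min ones

# points: array of hashes with :city, :country and :timestamp (unix seconds)
def toponyms_for(points)
  points.group_by { |p| p[:country] }.map do |country, country_points|
    cities = country_points.group_by { |p| p[:city] }.filter_map do |city, city_points|
      timestamps = city_points.map { |p| p[:timestamp] }
      minutes_spent = (timestamps.max - timestamps.min) / 60
      { 'city' => city } if minutes_spent >= MIN_MINUTES_SPENT_IN_CITY
    end
    { 'country' => country, 'cities' => cities }
  end
end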

@@ -55,8 +55,8 @@ describe 'Stats API', type: :request do
   ]

   let!(:user) { create(:user) }
-  let!(:stats_in_2020) { create_list(:stat, 12, year: 2020, user:) }
-  let!(:stats_in_2021) { create_list(:stat, 12, year: 2021, user:) }
+  let!(:stats_in_2020) { (1..12).map { |month| create(:stat, year: 2020, month:, user:) } }
+  let!(:stats_in_2021) { (1..12).map { |month| create(:stat, year: 2021, month:, user:) } }
   let!(:points_in_2020) do
     (1..85).map do |i|
       create(:point, :with_geodata, :reverse_geocoded, timestamp: Time.zone.local(2020, 1, 1).to_i + i.hours,