# refactor: move assessment event fields to parent class #2240
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
## Conversation
Codecov Report
Attention: Patch coverage is

Additional details and impacted files:

```
@@             Coverage Diff              @@
##               main    #2240      +/-  ##
============================================
- Coverage     16.87%   16.68%    -0.20%
- Complexity      423      424        +1
============================================
  Files           245      246        +1
  Lines          6838     6922       +84
  Branches        793      804       +11
============================================
+ Hits           1154     1155        +1
- Misses         5630     5713       +83
  Partials         54       54
============================================
```

☔ View full report in Codecov by Sentry.
## Walkthrough
The changes update dependency versions in the build system and modify the analytics entity classes by adding new fields to `AssessmentEvent` while removing similar fields from `LetterSoundAssessmentEvent` and `WordAssessmentEvent`. The example JSON in the Javadoc was updated, `@NotNull` annotations were added to two fields in `LetterSoundAssessmentEvent`, and the SQL schema column order was adjusted for related tables. The CSV export column order was also modified, and a migration script enforces non-null constraints on two columns. A new CSV extraction method and a new import scheduler service for `LetterSoundAssessmentEvent` were added, along with DAO interface and implementation changes.
## Changes
| File(s) | Change Summary |
|-------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------|
| pom-dependency-tree.txt | Updated dependency versions for `ai.elimu:webapp`, `ai.elimu:model`, and `com.google.code.gson`. |
| src/main/java/ai/elimu/entity/analytics/AssessmentEvent.java | Added `masteryScore` (Float) and `timeSpentMs` (Long) fields; updated Javadoc example JSON. |
| src/main/java/ai/elimu/entity/analytics/LetterSoundAssessmentEvent.java | Removed `masteryScore` and `timeSpentMs` fields; added `@NotNull` to `letterSoundLetters` and `letterSoundSounds`. |
| src/main/java/ai/elimu/entity/analytics/WordAssessmentEvent.java | Removed `masteryScore` and `timeSpentMs` fields. |
| src/main/resources/META-INF/jpa-schema-export.sql | Reordered columns in `LetterSoundAssessmentEvent` and `WordAssessmentEvent` tables. |
| src/main/java/ai/elimu/web/analytics/students/LetterSoundAssessmentEventsCsvExportController.java | Reordered CSV export columns related to letter sound fields. |
| src/main/java/ai/elimu/web/analytics/students/WordAssessmentEventsCsvExportController.java | Reordered CSV export columns related to word fields. |
| src/main/resources/db/migration/2006047.sql | Migration script to enforce `NOT NULL` on `letterSoundLetters` and `letterSoundSounds` columns. |
| src/main/java/ai/elimu/util/csv/CsvAnalyticsExtractionHelper.java | Added method to extract `LetterSoundAssessmentEvent` from CSV; reordered extraction of `wordText` and `wordId`. |
| src/main/java/ai/elimu/dao/LetterSoundAssessmentEventDao.java | Added `read` method to fetch single `LetterSoundAssessmentEvent` by timestamp, androidId, and packageName. |
| src/main/java/ai/elimu/dao/jpa/LetterSoundAssessmentEventDaoJpa.java | Implemented `read` method with JPA query and exception handling. |
| src/main/java/ai/elimu/tasks/analytics/LetterSoundAssessmentEventImportScheduler.java | New scheduled service to import `LetterSoundAssessmentEvent` CSV files from disk into the database, with student creation and duplicate checks. |
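The core of the refactor summarized above can be sketched in plain Java. This is an illustration only: the real entities carry JPA and validation annotations, IDs, and accessors that are omitted here, and the class bodies shown are assumptions, not the project's actual code.

```java
// Plain-Java sketch of the refactor: the two shared fields now live once in
// the abstract parent instead of being duplicated in each subclass.
abstract class AssessmentEvent {
    Float masteryScore; // e.g. 0.5f for 50% mastery (moved up in this PR)
    Long timeSpentMs;   // time spent on the assessment, in milliseconds (moved up in this PR)
}

// Subclasses no longer re-declare masteryScore/timeSpentMs.
class LetterSoundAssessmentEvent extends AssessmentEvent {
    String letterSoundLetters; // @NotNull in the real entity
    String letterSoundSounds;  // @NotNull in the real entity
}

class WordAssessmentEvent extends AssessmentEvent {
    String wordText;
    Long wordId;
}
```

Any subclass instance can then set or read the inherited fields directly, which is why the SQL schema and CSV export column orders change in lockstep.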
## Sequence Diagram(s)
```mermaid
sequenceDiagram
participant Scheduler as ImportScheduler
participant FileSystem as File System
participant CsvHelper as CsvAnalyticsExtractionHelper
participant Dao as LetterSoundAssessmentEventDao
participant StudentDao as StudentDao
participant Database as Database
Scheduler->>FileSystem: Scan analytics CSV directories
FileSystem-->>Scheduler: List CSV files
Scheduler->>CsvHelper: extractLetterSoundAssessmentEvents(csvFile)
CsvHelper-->>Scheduler: List of events
loop For each event
Scheduler->>Dao: read(event.timestamp, event.androidId, event.packageName)
alt Event not found
Scheduler->>StudentDao: read(event.androidId)
alt Student not found
Scheduler->>Database: save new Student
end
Scheduler->>Database: save new LetterSoundAssessmentEvent
end
end
```

Possibly related PRs
Suggested reviewers
Actionable comments posted: 2
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
`pom.xml` is excluded by `!**/*.xml`
📒 Files selected for processing (5)
- `pom-dependency-tree.txt` (1 hunks)
- `src/main/java/ai/elimu/entity/analytics/AssessmentEvent.java` (1 hunks)
- `src/main/java/ai/elimu/entity/analytics/LetterSoundAssessmentEvent.java` (0 hunks)
- `src/main/java/ai/elimu/entity/analytics/WordAssessmentEvent.java` (0 hunks)
- `src/main/resources/META-INF/jpa-schema-export.sql` (2 hunks)
💤 Files with no reviewable changes (2)
- src/main/java/ai/elimu/entity/analytics/WordAssessmentEvent.java
- src/main/java/ai/elimu/entity/analytics/LetterSoundAssessmentEvent.java
⏰ Context from checks skipped due to timeout of 90000ms (12)
- GitHub Check: test_rest
- GitHub Check: test_rest
- GitHub Check: test_rest
- GitHub Check: build (windows-latest, 17)
- GitHub Check: build (macos-latest, 17)
- GitHub Check: build (windows-latest, 21)
- GitHub Check: build (macos-latest, 21)
- GitHub Check: build (ubuntu-latest, 21)
- GitHub Check: build (ubuntu-latest, 17)
- GitHub Check: test_rest
- GitHub Check: test_rest
- GitHub Check: test_rest
🔇 Additional comments (5)
pom-dependency-tree.txt (1)
1-4: ✅ Verification successful
Verify dependency versions for security and compatibility.
The dependency updates include version increments for the model library and supporting dependencies. Ensure these versions are secure and compatible with the current system.
🏁 Script executed:
```shell
#!/bin/bash
# Description: Check for security advisories for updated dependencies
echo "Checking Gson version 2.13.1 for security advisories..."
gh api graphql -f query='
{
  securityVulnerabilities(first: 5, ecosystem: MAVEN, package: "com.google.code.gson:gson") {
    nodes {
      advisory {
        summary
        severity
        publishedAt
      }
      vulnerableVersionRange
      firstPatchedVersion {
        identifier
      }
    }
  }
}'

echo "Checking latest available versions..."
curl -s "https://search.maven.org/solrsearch/select?q=g:com.google.code.gson+AND+a:gson&core=gav&rows=1&wt=json" | jq '.response.docs[0].latestVersion'
```

Length of output: 927
Dependencies Verified: No Security Issues Detected
- com.google.code.gson:gson 2.13.1 is outside the known vulnerable range (< 2.8.9); no new advisories.
- com.google.errorprone:error_prone_annotations 2.38.0 has no reported vulnerabilities.
- Internal version bumps for ai.elimu:webapp 2.6.47-SNAPSHOT and ai.elimu:model 2.0.111 introduce no security concerns.
All updates are secure and compatible—no further action needed.
src/main/java/ai/elimu/entity/analytics/AssessmentEvent.java (2)
52-57: LGTM! Well-documented time tracking field.

The `timeSpentMs` field implementation is correct, with clear documentation and an appropriate data type choice.
63-63: ❓ Verification inconclusive
Verify consistency of JSON structure change across the codebase.
The example JSON was updated from `'word_id_selected': 2` to `'word_id_selected': [2]`, suggesting a change from single value to array format. Ensure this change is consistently applied throughout the system.
🏁 Script executed:
```shell
#!/bin/bash
# Description: Search for usage patterns of word_id_selected to verify consistency
echo "Searching for word_id_selected usage patterns..."
rg -A 3 -B 3 "word_id_selected" --type java

echo "Searching for similar JSON patterns in comments and strings..."
rg -A 2 -B 2 "'word_id_selected':\s*[0-9]" --type java

echo "Checking for array vs single value patterns in assessment data..."
rg -A 5 -B 5 "additionalData.*word.*selected" --type java
```

Length of output: 1112
Verify `word_id_selected` Array Format Across the System

I only found the updated example in `src/main/java/ai/elimu/entity/analytics/AssessmentEvent.java`. To ensure no other code, tests or docs still expect a single integer, please confirm that:
- JSON serializers/deserializers in the back-end handle an array.
- Unit/integration tests reference an array, not a scalar.
- API docs (Swagger/OpenAPI) and sample payloads use the array form.
- Front-end code or clients sending/receiving this field expect an array.
🏁 Script executed:
#!/bin/bash # Search all file types for any remaining uses of word_id_selected rg -n "word_id_selected" .Length of output: 154
`word_id_selected` Array Usage Verified, Manual Review Recommended

A repo-wide search for `word_id_selected` returned only the example in `src/main/java/ai/elimu/entity/analytics/AssessmentEvent.java`. No other direct references were found. Please ensure:
- JSON serializers/deserializers properly handle an array (not a single value).
- Unit and integration tests reference the array format.
- API documentation (Swagger/OpenAPI) and sample payloads show the array.
- Any front-end code or external clients sending/receiving this field expect an array.
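One transition-friendly option (an assumption on our part, not something this PR implements) is to normalize the field tolerantly on read, accepting both the old scalar form and the new array form. A self-contained sketch; the class and method names are ours:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class WordIdSelectedNormalizer {

    // Accepts either the legacy scalar payload ('word_id_selected': 2) or the
    // new array payload ('word_id_selected': [2]) and returns a list of IDs.
    public static List<Long> normalize(Object raw) {
        if (raw instanceof Number) {
            // Legacy scalar payload: wrap the single ID in a list
            return Collections.singletonList(((Number) raw).longValue());
        }
        if (raw instanceof List) {
            // New array payload: copy each element as a Long
            List<Long> ids = new ArrayList<>();
            for (Object o : (List<?>) raw) {
                ids.add(((Number) o).longValue());
            }
            return ids;
        }
        throw new IllegalArgumentException("Unsupported word_id_selected value: " + raw);
    }
}
```

With this in place, older clients sending `2` and newer clients sending `[2]` normalize to the same list before any downstream processing.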
src/main/resources/META-INF/jpa-schema-export.sql (2)
311-324: Schema changes align with entity refactoring.

The column reordering for `masteryScore` and `timeSpentMs` in the `LetterSoundAssessmentEvent` table correctly reflects the movement of these fields to the parent `AssessmentEvent` class.
609-621: Consistent schema structure for WordAssessmentEvent.

The column positioning matches the changes made to `LetterSoundAssessmentEvent`, maintaining consistency across assessment event tables.
Actionable comments posted: 4
🧹 Nitpick comments (1)
src/main/java/ai/elimu/tasks/analytics/LetterSoundAssessmentEventImportScheduler.java (1)
52-53: Consider performance impact of synchronized method.

The `synchronized` modifier on the entire method could create a performance bottleneck, especially if CSV processing takes significant time. Consider using more granular locking or file-based locking to prevent concurrent processing of the same files while allowing parallel processing of different directories.

Consider implementing file-based locking or processing different Android IDs in parallel:

```diff
-@Scheduled(cron = "00 15 * * * *") // 15 minutes past every hour
-public synchronized void execute() {
+@Scheduled(cron = "00 15 * * * *") // 15 minutes past every hour
+public void execute() {
```

And add file-level locking within the CSV processing loop.
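One way to realize the more granular locking suggested here is a `tryLock` per Android-ID directory, so different directories can be processed in parallel while the same directory is never processed twice concurrently. This is a hedged sketch, not the scheduler's actual code; the class and `processDirectory` names are ours:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.locks.ReentrantLock;

public class PerDirectoryLocks {

    // One lock per directory path; computeIfAbsent creates it on first use.
    private static final ConcurrentHashMap<String, ReentrantLock> LOCKS = new ConcurrentHashMap<>();

    // Returns false (skips the work) if another thread already holds the
    // lock for this directory; otherwise runs the work and releases the lock.
    public static boolean processDirectory(String androidIdDir, Runnable work) {
        ReentrantLock lock = LOCKS.computeIfAbsent(androidIdDir, k -> new ReentrantLock());
        if (!lock.tryLock()) {
            return false; // another thread is importing this directory right now
        }
        try {
            work.run();
            return true;
        } finally {
            lock.unlock();
        }
    }
}
```

The scheduler method itself would then no longer need to be `synchronized`: each directory's import is serialized independently.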
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
- `src/main/java/ai/elimu/dao/LetterSoundAssessmentEventDao.java` (1 hunks)
- `src/main/java/ai/elimu/dao/jpa/LetterSoundAssessmentEventDaoJpa.java` (1 hunks)
- `src/main/java/ai/elimu/tasks/analytics/LetterSoundAssessmentEventImportScheduler.java` (1 hunks)
- `src/main/java/ai/elimu/util/csv/CsvAnalyticsExtractionHelper.java` (3 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- src/main/java/ai/elimu/util/csv/CsvAnalyticsExtractionHelper.java
⏰ Context from checks skipped due to timeout of 90000ms (11)
- GitHub Check: build (windows-latest, 21)
- GitHub Check: build (windows-latest, 17)
- GitHub Check: build (macos-latest, 17)
- GitHub Check: test_rest
- GitHub Check: test_rest
- GitHub Check: build (ubuntu-latest, 17)
- GitHub Check: build (ubuntu-latest, 21)
- GitHub Check: test_rest
- GitHub Check: test_rest
- GitHub Check: test_rest
- GitHub Check: test_rest
🔇 Additional comments (2)
src/main/java/ai/elimu/dao/LetterSoundAssessmentEventDao.java (1)
3-3: LGTM! Clean DAO interface addition.

The new `read` method follows standard DAO patterns and provides a clean interface for finding events by composite key (timestamp, androidId, packageName).

Also applies to: 12-12
src/main/java/ai/elimu/dao/jpa/LetterSoundAssessmentEventDaoJpa.java (1)
14-31: LGTM! Solid JPA implementation with proper exception handling.

The implementation correctly uses parameterized queries and handles the `NoResultException` appropriately by logging and returning null. The query structure and parameter binding follow JPA best practices.
```java
log.info("csvFile: " + csvFile);

// Convert from CSV to Java
List<LetterSoundAssessmentEvent> events = CsvAnalyticsExtractionHelper.extractLetterSoundAssessmentEvents(csvFile);
```
💡 Verification agent
🧩 Analysis chain
Verify CSV extraction helper bug fix.
Based on the relevant code snippet, there's a bug in the CSV extraction helper where letterSoundSounds is incorrectly assigned to letterSoundLetters.
🏁 Script executed:
#!/bin/bash
# Description: Check if the CSV extraction helper has the setter bug
# Expected: Find incorrect assignment of letterSoundSounds to letterSoundLetters
rg -A 2 -B 2 "setLetterSoundLetters.*letterSoundSounds" src/main/java/ai/elimu/util/csv/CsvAnalyticsExtractionHelper.javaLength of output: 376
Fix CSV helper setter bug
The CsvAnalyticsExtractionHelper is incorrectly assigning the “sounds” value to the “letters” field. In src/main/java/ai/elimu/util/csv/CsvAnalyticsExtractionHelper.java, update the setter call:
```diff
@@ -XXX,3 +XXX,3 @@
- letterSoundAssessmentEvent.setLetterSoundLetters(letterSoundSounds);
+ letterSoundAssessmentEvent.setLetterSoundSounds(letterSoundSounds);
```

This ensures that letterSoundSounds is stored in the proper field.
🤖 Prompt for AI Agents
In src/main/java/ai/elimu/util/csv/CsvAnalyticsExtractionHelper.java around the
setter calls for letterSoundLetters and letterSoundSounds, there is a bug where
letterSoundSounds is incorrectly assigned to letterSoundLetters. To fix this,
locate the setter call that assigns letterSoundSounds and ensure it is set using
the correct setter method setLetterSoundSounds instead of setLetterSoundLetters.
This will correctly store the sounds value in the appropriate field.
```java
// Convert from CSV to Java
List<LetterSoundAssessmentEvent> events = CsvAnalyticsExtractionHelper.extractLetterSoundAssessmentEvents(csvFile);
log.info("events.size(): " + events.size());

// Store in database
for (LetterSoundAssessmentEvent event : events) {
    // Check if the event has already been stored in the database
    LetterSoundAssessmentEvent existingLetterSoundAssessmentEvent = letterSoundAssessmentEventDao.read(event.getTimestamp(), event.getAndroidId(), event.getPackageName());
    if (existingLetterSoundAssessmentEvent != null) {
        log.warn("The event has already been stored in the database. Skipping data import.");
        continue;
    }

    // Generate Student ID
    Student existingStudent = studentDao.read(event.getAndroidId());
    if (existingStudent == null) {
        Student student = new Student();
        student.setAndroidId(event.getAndroidId());
        studentDao.create(student);
        log.info("Stored Student in database with ID " + student.getId());
    }

    // Store the event in the database
    letterSoundAssessmentEventDao.create(event);
    log.info("Stored event in database with ID " + event.getId());
}
```
🛠️ Refactor suggestion
Add transaction management for data consistency.
The import process lacks transaction boundaries, which could lead to inconsistent state if the process fails partway through. Student creation and event insertion should be wrapped in transactions to ensure atomicity.
Add @Transactional annotation and proper transaction boundaries:
```diff
+import org.springframework.transaction.annotation.Transactional;
+
 @Service
 @RequiredArgsConstructor
 @Slf4j
 public class LetterSoundAssessmentEventImportScheduler {

     @Scheduled(cron = "00 15 * * * *") // 15 minutes past every hour
+    @Transactional
     public void execute() {
```

Consider wrapping the inner processing loop in separate transactions to limit rollback scope.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```java
import org.springframework.transaction.annotation.Transactional;

@Service
@RequiredArgsConstructor
@Slf4j
public class LetterSoundAssessmentEventImportScheduler {

    @Scheduled(cron = "00 15 * * * *") // 15 minutes past every hour
    @Transactional
    public void execute() {
        // Convert from CSV to Java
        List<LetterSoundAssessmentEvent> events = CsvAnalyticsExtractionHelper.extractLetterSoundAssessmentEvents(csvFile);
        log.info("events.size(): " + events.size());

        // Store in database
        for (LetterSoundAssessmentEvent event : events) {
            // Check if the event has already been stored in the database
            LetterSoundAssessmentEvent existingLetterSoundAssessmentEvent =
                    letterSoundAssessmentEventDao.read(
                            event.getTimestamp(), event.getAndroidId(), event.getPackageName());
            if (existingLetterSoundAssessmentEvent != null) {
                log.warn("The event has already been stored in the database. Skipping data import.");
                continue;
            }

            // Generate Student ID
            Student existingStudent = studentDao.read(event.getAndroidId());
            if (existingStudent == null) {
                Student student = new Student();
                student.setAndroidId(event.getAndroidId());
                studentDao.create(student);
                log.info("Stored Student in database with ID " + student.getId());
            }

            // Store the event in the database
            letterSoundAssessmentEventDao.create(event);
            log.info("Stored event in database with ID " + event.getId());
        }
    }
}
```
🤖 Prompt for AI Agents
In
src/main/java/ai/elimu/tasks/analytics/LetterSoundAssessmentEventImportScheduler.java
around lines 74 to 99, the import process lacks transaction management, risking
inconsistent database state if failures occur mid-process. To fix this, add the
@Transactional annotation to the method handling this import or explicitly
manage transactions around the student creation and event insertion logic.
Consider wrapping each iteration of the loop in its own transaction to limit
rollback scope and ensure atomicity for each event processed.
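The "one transaction per event" idea can be illustrated without any Spring dependency. In the real scheduler each per-event unit would be a `TransactionTemplate.execute(...)` call; here the commit boundary is simulated with a try/catch, and the class name is ours:

```java
import java.util.List;

public class PerEventCommit {

    // Runs each per-event unit independently: one failing event is "rolled
    // back" (skipped and logged) without discarding the events that succeeded.
    public static int importAll(List<Runnable> perEventUnits) {
        int imported = 0;
        for (int i = 0; i < perEventUnits.size(); i++) {
            try {
                perEventUnits.get(i).run(); // commit boundary per event
                imported++;
            } catch (RuntimeException e) {
                // In the real scheduler only this event's transaction rolls back
                System.err.println("event " + i + " failed: " + e.getMessage());
            }
        }
        return imported;
    }
}
```

This limits rollback scope to a single event, which matters for large CSV files where one malformed row should not void an hour's import.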
🛠️ Refactor suggestion
Add error handling for CSV processing and database operations.
The code lacks error handling around CSV extraction and database operations. If CSV parsing fails or database operations throw exceptions, the scheduler could fail silently or crash.
Wrap file processing in try-catch blocks:
```diff
 for (File csvFile : letterSoundAssessmentEventsDir.listFiles()) {
     log.info("csvFile: " + csvFile);
-
-    // Convert from CSV to Java
-    List<LetterSoundAssessmentEvent> events = CsvAnalyticsExtractionHelper.extractLetterSoundAssessmentEvents(csvFile);
-    log.info("events.size(): " + events.size());
-
-    // Store in database
-    for (LetterSoundAssessmentEvent event : events) {
+
+    try {
+        // Convert from CSV to Java
+        List<LetterSoundAssessmentEvent> events = CsvAnalyticsExtractionHelper.extractLetterSoundAssessmentEvents(csvFile);
+        log.info("events.size(): " + events.size());
+
+        // Store in database
+        for (LetterSoundAssessmentEvent event : events) {
             // ... existing processing logic
+        }
+    } catch (Exception e) {
+        log.error("Failed to process CSV file: " + csvFile.getName(), e);
+        // Consider moving file to error directory or marking as failed
     }
 }
```

📝 Committable suggestion
```java
for (File csvFile : letterSoundAssessmentEventsDir.listFiles()) {
    log.info("csvFile: " + csvFile);

    try {
        // Convert from CSV to Java
        List<LetterSoundAssessmentEvent> events = CsvAnalyticsExtractionHelper.extractLetterSoundAssessmentEvents(csvFile);
        log.info("events.size(): " + events.size());

        // Store in database
        for (LetterSoundAssessmentEvent event : events) {
            // Check if the event has already been stored in the database
            LetterSoundAssessmentEvent existingLetterSoundAssessmentEvent = letterSoundAssessmentEventDao.read(
                    event.getTimestamp(),
                    event.getAndroidId(),
                    event.getPackageName()
            );
            if (existingLetterSoundAssessmentEvent != null) {
                log.warn("The event has already been stored in the database. Skipping data import.");
                continue;
            }

            // Generate Student ID
            Student existingStudent = studentDao.read(event.getAndroidId());
            if (existingStudent == null) {
                Student student = new Student();
                student.setAndroidId(event.getAndroidId());
                studentDao.create(student);
                log.info("Stored Student in database with ID " + student.getId());
            }

            // Store the event in the database
            letterSoundAssessmentEventDao.create(event);
            log.info("Stored event in database with ID " + event.getId());
        }
    } catch (Exception e) {
        log.error("Failed to process CSV file: " + csvFile.getName(), e);
        // Consider moving the file to an error directory or marking it as failed
    }
}
```
🤖 Prompt for AI Agents
In
src/main/java/ai/elimu/tasks/analytics/LetterSoundAssessmentEventImportScheduler.java
around lines 74 to 99, add try-catch blocks around the CSV extraction and
database operations to handle potential exceptions. Wrap the call to
CsvAnalyticsExtractionHelper.extractLetterSoundAssessmentEvents and the
subsequent database read and create calls in try-catch blocks to catch and log
exceptions, preventing the scheduler from failing silently or crashing. Ensure
that errors are logged with sufficient detail for debugging.
```java
for (File analyticsDirFile : analyticsDir.listFiles()) {
    if (analyticsDirFile.getName().startsWith("android-id-")) {
        File androidIdDir = new File(analyticsDir, analyticsDirFile.getName());
        for (File androidIdDirFile : androidIdDir.listFiles()) {
            if (androidIdDirFile.getName().startsWith("version-code-")) {
                File versionCodeDir = new File(androidIdDir, androidIdDirFile.getName());
                for (File versionCodeDirFile : versionCodeDir.listFiles()) {
                    if (versionCodeDirFile.getName().equals("letter-sound-assessment-events")) {
                        File letterSoundAssessmentEventsDir = new File(versionCodeDir, versionCodeDirFile.getName());
                        for (File csvFile : letterSoundAssessmentEventsDir.listFiles()) {
                            log.info("csvFile: " + csvFile);

                            // Convert from CSV to Java
                            List<LetterSoundAssessmentEvent> events = CsvAnalyticsExtractionHelper.extractLetterSoundAssessmentEvents(csvFile);
                            log.info("events.size(): " + events.size());

                            // Store in database
                            for (LetterSoundAssessmentEvent event : events) {
                                // Check if the event has already been stored in the database
                                LetterSoundAssessmentEvent existingLetterSoundAssessmentEvent = letterSoundAssessmentEventDao.read(event.getTimestamp(), event.getAndroidId(), event.getPackageName());
                                if (existingLetterSoundAssessmentEvent != null) {
                                    log.warn("The event has already been stored in the database. Skipping data import.");
                                    continue;
                                }

                                // Generate Student ID
                                Student existingStudent = studentDao.read(event.getAndroidId());
                                if (existingStudent == null) {
                                    Student student = new Student();
                                    student.setAndroidId(event.getAndroidId());
                                    studentDao.create(student);
                                    log.info("Stored Student in database with ID " + student.getId());
                                }

                                // Store the event in the database
                                letterSoundAssessmentEventDao.create(event);
                                log.info("Stored event in database with ID " + event.getId());
                            }
                        }
                    }
                }
            }
        }
    }
}
```
Critical: Add null checks and error handling for directory traversal.
The directory traversal logic is vulnerable to NullPointerException if any directory returns null from listFiles(). This can happen if the directory doesn't exist, lacks read permissions, or encounters I/O errors.
```diff
 for (File analyticsDirFile : analyticsDir.listFiles()) {
+    if (analyticsDirFile == null) continue;
     if (analyticsDirFile.getName().startsWith("android-id-")) {
         File androidIdDir = new File(analyticsDir, analyticsDirFile.getName());
-        for (File androidIdDirFile : androidIdDir.listFiles()) {
+        File[] androidIdFiles = androidIdDir.listFiles();
+        if (androidIdFiles == null) continue;
+        for (File androidIdDirFile : androidIdFiles) {
             if (androidIdDirFile.getName().startsWith("version-code-")) {
                 File versionCodeDir = new File(androidIdDir, androidIdDirFile.getName());
-                for (File versionCodeDirFile : versionCodeDir.listFiles()) {
+                File[] versionCodeFiles = versionCodeDir.listFiles();
+                if (versionCodeFiles == null) continue;
+                for (File versionCodeDirFile : versionCodeFiles) {
                     if (versionCodeDirFile.getName().equals("letter-sound-assessment-events")) {
                         File letterSoundAssessmentEventsDir = new File(versionCodeDir, versionCodeDirFile.getName());
-                        for (File csvFile : letterSoundAssessmentEventsDir.listFiles()) {
+                        File[] csvFiles = letterSoundAssessmentEventsDir.listFiles();
+                        if (csvFiles == null) continue;
+                        for (File csvFile : csvFiles) {
```

📝 Committable suggestion
```java
for (File analyticsDirFile : analyticsDir.listFiles()) {
    if (analyticsDirFile == null) {
        continue;
    }
    if (analyticsDirFile.getName().startsWith("android-id-")) {
        File androidIdDir = new File(analyticsDir, analyticsDirFile.getName());
        File[] androidIdFiles = androidIdDir.listFiles();
        if (androidIdFiles == null) {
            continue;
        }
        for (File androidIdDirFile : androidIdFiles) {
            if (androidIdDirFile.getName().startsWith("version-code-")) {
                File versionCodeDir = new File(androidIdDir, androidIdDirFile.getName());
                File[] versionCodeFiles = versionCodeDir.listFiles();
                if (versionCodeFiles == null) {
                    continue;
                }
                for (File versionCodeDirFile : versionCodeFiles) {
                    if (versionCodeDirFile.getName().equals("letter-sound-assessment-events")) {
                        File letterSoundAssessmentEventsDir = new File(versionCodeDir, versionCodeDirFile.getName());
                        File[] csvFiles = letterSoundAssessmentEventsDir.listFiles();
                        if (csvFiles == null) {
                            continue;
                        }
                        for (File csvFile : csvFiles) {
                            log.info("csvFile: " + csvFile);

                            // Convert from CSV to Java
                            List<LetterSoundAssessmentEvent> events =
                                    CsvAnalyticsExtractionHelper.extractLetterSoundAssessmentEvents(csvFile);
                            log.info("events.size(): " + events.size());

                            // Store in database
                            for (LetterSoundAssessmentEvent event : events) {
                                // ...
                            }
                        }
                    }
                }
            }
        }
    }
}
```
🤖 Prompt for AI Agents
In
src/main/java/ai/elimu/tasks/analytics/LetterSoundAssessmentEventImportScheduler.java
between lines 62 and 106, add null checks after each call to listFiles() to
ensure the returned array is not null before iterating. If listFiles() returns
null, skip processing that directory to prevent NullPointerException.
Additionally, consider adding error handling or logging to capture cases where
directories are missing or inaccessible.
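The null-safe traversal described above can also be factored into a tiny helper so each nesting level stays readable. A sketch; the helper name is ours, not part of the PR:

```java
import java.io.File;

public class SafeFiles {

    private static final File[] EMPTY = new File[0];

    // File.listFiles() returns null when the path is missing, is not a
    // directory, lacks read permission, or an I/O error occurs. Returning an
    // empty array instead lets callers use a plain for-each loop safely.
    public static File[] safeListFiles(File dir) {
        File[] files = dir.listFiles();
        return (files != null) ? files : EMPTY;
    }
}
```

Each `for (File f : someDir.listFiles())` in the scheduler would then become `for (File f : SafeFiles.safeListFiles(someDir))`.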
https://github.com/elimu-ai/model/releases/tag/model-2.0.111
Issue Number
Purpose
Technical Details
Testing Instructions
Screenshots
Format Checks
Note
Files in PRs are automatically checked for format violations with `mvn spotless:check`. If this PR contains files with format violations, run `mvn spotless:apply` to fix them.