Currently, many of our API keys that are billed based on usage are not domain-protected. They probably should be.
**Areas / scope to test** — each area below lists its front-end, back-end, and model checks, followed by notes.
**Requirement-specific constraints**

- Exported doc shows “Depth_cm”, not “Depth”
**Role-based constraints**

- FW (farm worker) can’t see the upload modal, or even the “+” symbol on the farm map (not strictly in scope of this ticket, but easily verifiable)
- FW shouldn’t be able to download the template
- FW shouldn’t be able to upload the template

Notes: Does role determine what a user can see or do? Is this enforced uniformly across the front end and back end?
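The role checks above are easiest to keep consistent when the front end and back end share one predicate. A minimal sketch, assuming illustrative role names and helper names (LiteFarm's actual identifiers may differ):

```typescript
// Hypothetical role names; LiteFarm's actual role identifiers may differ.
type Role = "Owner" | "Manager" | "Extension Officer" | "Farm Worker";

const SENSOR_UPLOAD_ROLES: ReadonlySet<Role> = new Set([
  "Owner",
  "Manager",
  "Extension Officer",
]);

// One predicate shared by both sides: the front end uses it to hide the
// "+" button and upload modal, the back end to reject the request outright,
// so the rule cannot drift between layers.
function canUploadSensors(role: Role): boolean {
  return SENSOR_UPLOAD_ROLES.has(role);
}
```

Testing then reduces to exercising the same predicate from both entry points (UI visibility and the upload endpoint).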
**User-preference constraints**

- Validation and logic should be updated to ensure uploads in different languages create sensors as expected

Notes: Is this impacted by user or farm preferences such as language, system of measure, or certification status?
**Numerical input constraints**

- Validation must verify, via an in-line error, that 0 <= Depth_cm <= 1000
- Depth_cm may be either a decimal or an integer

Notes: Do we appropriately handle negative, very small, very large, or 0 as inputs?
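The numeric rule above can be sketched as a single check; the error copy here is placeholder text, not the product's actual messages:

```typescript
// Range rule for Depth_cm: integer or decimal, 0 <= value <= 1000.
// Returns an in-line error message, or null when the value is valid.
// Message strings are placeholders, not the real copy.
function validateDepthCm(value: number): string | null {
  if (!Number.isFinite(value)) return "Depth_cm must be a number";
  if (value < 0 || value > 1000) return "Depth_cm must be between 0 and 1000";
  return null; // both integers and decimals pass
}
```

The boundary cases in the notes (0, 1000, negatives, just-over-range decimals) are exactly the ones worth enumerating in tests.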
**Text input constraints**

- Validation must verify, via an in-line error, that 0 <= Depth_cm <= 1000
- Depth_cm may be either a decimal or an integer

Notes: Do we appropriately handle blank, very small, and very large inputs? Is there a strict format (such as email) that must be followed?
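Because the value arrives as CSV text, blank and malformed cells deserve their own checks before any numeric range check runs. A sketch, where the strict format (no exponents, units, or thousands separators) and the error copy are assumptions for illustration:

```typescript
// Text-stage checks for the Depth_cm cell: blank and malformed strings each
// get their own in-line error before the range check. Format and messages
// are illustrative assumptions.
function parseDepthCell(cell: string): { value?: number; error?: string } {
  const trimmed = cell.trim();
  if (trimmed === "") return { error: "Depth_cm is required" };
  // Plain integer or decimal only; rejects "1e3", "10cm", "1,000", etc.
  if (!/^-?\d+(\.\d+)?$/.test(trimmed)) return { error: "Depth_cm must be a number" };
  return { value: Number(trimmed) };
}
```

Note that `Number("")` coerces to `0` in JavaScript, which is why the blank check must come before any numeric conversion.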
**Date-based constraints**

- Validation must verify, via an in-line error, that 0 <= Depth_cm <= 1000
- Depth_cm may be either a decimal or an integer

Notes: Are there logical restrictions on what dates can be input? Should a user be able to complete something in the future, for example?
**Date-based assumptions**

- N/A

Notes: Are we making valid assumptions about what dates should be allowed?
**Timezone-driven interactions**

- N/A

Notes: If timezones play a role in the data, are they being displayed or converted appropriately?
**Interaction / transitioning-UI constraints**

- The in-line error for invalid “Depth_cm” values displays “There were some issues with your upload. Click here to view them.”, and “Upload” stays disabled
- For a CSV that passes validation, the “Upload” button should become active
- After uploading 1+ sensors, the map should resize to show all sensors and all other visible locations

Notes: Is the UI transitioning appropriately? Is the API providing data…
**Flow-based constraints**

- When clicking “<” from the modal, expand the drawer
- There is no expectation that the upload is persisted when going back
- After uploading an invalid CSV, the user can upload a valid one directly, flushing the previous one

Notes: Is state being preserved appropriately in a flow? If I go back and then forward, is it maintained? Is state invalidated when it should be?
**Synchronous / asynchronous constraints**

- If the upload takes more than 3 seconds, transition to the asynchronous flow and prompt the user with a modal; the user should receive a follow-up notification
- If the upload takes less than 3 seconds, the sensor(s) should appear on the map after successful persistence; the user should see a success banner
- When the notification is received, the data should be available in the database

Notes: Is the interaction synchronous, asynchronous, or does it support both? If so, can you simulate both?
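The 3-second cutover above can be simulated by racing the upload against a timer; a sketch under the assumption that the UI wiring happens via callbacks (names are ours, not the app's):

```typescript
// Sketch of the 3-second sync/async cutover. Callback names are illustrative.
const ASYNC_CUTOVER_MS = 3000;

async function uploadWithCutover<T>(
  upload: Promise<T>,
  onSync: (result: T) => void, // success banner; sensors appear on the map
  onAsync: () => void,         // async modal now, follow-up notification later
): Promise<void> {
  let timerId: ReturnType<typeof setTimeout> | undefined;
  const timer = new Promise<"timeout">((resolve) => {
    timerId = setTimeout(() => resolve("timeout"), ASYNC_CUTOVER_MS);
  });
  const winner = await Promise.race([upload.then((result) => ({ result })), timer]);
  clearTimeout(timerId);
  if (winner === "timeout") {
    onAsync();    // the upload keeps running in the background
    await upload; // data must be persisted by the time the notification fires
  } else {
    onSync(winner.result);
  }
}
```

For testing, both paths can be forced by stubbing `upload` with a promise that resolves immediately (sync path) or after more than 3 seconds (async path).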
**Time-out / low-bandwidth constraints**

- For a lost connection, …

Notes: Does the feature fail gracefully under no-bandwidth / low-bandwidth environments?
**Data transformation correctness**

- After a successful upload, the sensor detail page for any uploaded sensor should show depth_cm appropriately converted from cm to meters
- After a successful upload, verify that a GET on the uploaded sensor(s) returns the depth appropriately converted
- Depth should be uploaded in cm and stored in m (i.e. divided by 100)

Notes: Are values appropriately updated when units change? Is it WYSIWYG?
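The conversion rule above is small enough to pin down directly; function names are ours for illustration:

```typescript
// Unit-conversion sketch: the CSV carries Depth_cm in centimetres, storage
// uses metres (divided by 100), and reads convert back for display.
const CM_PER_M = 100;

const cmToStoredMeters = (depthCm: number): number => depthCm / CM_PER_M;
const storedMetersToCm = (depthM: number): number => depthM * CM_PER_M;
```

A round-trip check (upload → store → GET → display) is the simplest way to confirm the WYSIWYG property in the notes.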
**Outcome correctness**

- On success, the success banner displays correctly for a single sensor and for 2+ sensors
- Validation errors display in-line for bad inputs
- Under poor connectivity, the flow appropriately switches to async
- Errors: the error code varies by reason
- Verify the sensor is created in the sensor table

Notes: When inputting known inputs with expected outputs, do you get the results you expect? Have you tested several cases of this?
**Switching farms**

- Verify uploaded sensors exist only at the farm where they were uploaded

Notes: Does this feature respond well to switching farms (and returning)?
**Notification constraints**

- Notify success or failure based on validation (including a valid Depth_cm value)

Notes: Should a notification be marshalled based on this action?
**Cascading effects**

- Upon successful upload, reframe the map to appropriately show all uploaded sensors
- Upon successful upload, clicking an uploaded sensor displays the correct details
- Upon successful upload, check which sensors are now registered, unregistered, or receiving data:
  - If a sensor already exists and is registered at another farm, adding it to this farm should fail
  - If a sensor was previously unregistered, it should be registered to this farm
  - If there is a data feed associated with a sensor, it should start receiving data

Notes: Are there logical places…
**Integration constraints**

- Upon successful upload, check which sensors are now registered via the Ensemble API, unregistered via the Ensemble API, or receiving data from the Ensemble API:
  - If a sensor already exists and is registered at another farm, adding it to this farm should fail
  - If a sensor was previously unregistered, it should be registered to this farm
  - If there is a data feed associated with a sensor, it should start receiving data

Notes: Do we need to ensure state is consistent between LiteFarm and the external service? What failure cases do we need to handle? How do we report the outcome back to the user or the external service?
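The registration rules above amount to a small decision function over the sensor's existing state. A sketch, where the state shape, reason strings, and function names are illustrative rather than the actual API contract:

```typescript
// Decision sketch for reconciling an uploaded sensor with its existing
// registration state (e.g. as reported by an external service such as the
// Ensemble API). State and reason names are illustrative.
type Registration =
  | { status: "unregistered" }
  | { status: "registered"; farmId: string };

type Outcome =
  | { ok: true; action: "register" }
  | { ok: false; reason: "registered_to_other_farm" };

function reconcileSensor(existing: Registration | null, farmId: string): Outcome {
  // A sensor registered at another farm must fail to be added here.
  if (existing !== null && existing.status === "registered" && existing.farmId !== farmId) {
    return { ok: false, reason: "registered_to_other_farm" };
  }
  // New, previously unregistered, or re-uploaded at the same farm: register it.
  return { ok: true, action: "register" };
}
```

Isolating the decision like this makes the failure cases in the notes testable without the external service in the loop.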
**Concurrency**

Notes: How do changes made to records affect other users on the farm? E.g., what happens when a record is soft-deleted while another user is viewing said record?