As part of the Data Standards review, a number of organisations will be required to amend previously uploaded historic data to comply with the current standards.
Due to the uncertainty over how long this process will take, and to prevent further delay in providing live data that complies with the new standards, I would like to suggest adding a date filter to the onboarding toolkit. This would allow an organisation to choose a date from which its data becomes visible, with nothing visible before that date. The filter could be added to the section where resources are made available to consuming organisations.
For example, if Harrogate District Foundation Trust has completed development of resources to the new standard and is ready to go live on 01/09/2022, but the historic data set will not be ready at the same time, the date filter could be set to display only data from 01/09/2022 onwards. This would allow HDFT to proceed with the live data feed while the historic data was made compliant. A message would need to be displayed on the portal to indicate that data is only visible from the selected filter date.
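To make the suggestion concrete, here is a minimal sketch of such a filter. It assumes each FHIR resource carries a timestamp in meta.lastUpdated and that the cut-off date is configured per organisation; the field used and the values shown are illustrative, not part of the actual toolkit.

```python
# Minimal sketch of the proposed date filter (illustrative only): assumes
# each FHIR resource carries a timestamp in meta.lastUpdated and that the
# go-live date is a hypothetical per-organisation setting.
from datetime import date, datetime

VISIBLE_FROM = date(2022, 9, 1)  # hypothetical per-organisation setting

def is_visible(resource: dict, visible_from: date = VISIBLE_FROM) -> bool:
    """Show a resource only if it is dated on or after the configured date."""
    last_updated = resource.get("meta", {}).get("lastUpdated")
    if last_updated is None:
        return False  # no date to judge by, so hide rather than risk showing old data
    return datetime.fromisoformat(last_updated).date() >= visible_from

records = [
    {"resourceType": "Encounter", "meta": {"lastUpdated": "2022-08-15T10:00:00"}},
    {"resourceType": "Encounter", "meta": {"lastUpdated": "2022-09-02T09:30:00"}},
]
print([r["meta"]["lastUpdated"] for r in records if is_visible(r)])
# -> only the 2022-09-02 record is shown
```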
Harrogate have now completed their data migration, and all data is flowing in line with the current standards. The work to hold back data before it was ready was done at the provider end rather than centrally.
A field was also added to the console which allows data providers to state the date from which data will be available for each resource, but this is informational only.
I believe this can now be closed.
Hi Debbie, the status is not ideal, but effectively we're awaiting feedback from Synanetics on the efficiency of the data fixing tool and the historical data. Also, there is an assumption in this request that 'new data' will be standards-compliant before 'old data', but it would be useful to have clarity from HDFT on their timelines so we could co-ordinate with Synanetics, thereby removing the need to build anything new, as suggested by Tim.
Hi there - note the status of this idea has moved to 'awaiting feedback' - who is it awaiting feedback from, please? Also, do we have a clearer view of the fix duration mentioned by Tim below? Appreciate there has been a lot of leave, but also that this idea was submitted nearly a month ago, and HDFT are no clearer on their ability to go live once live remapping is completed. We have a meeting with HDFT tomorrow. Many thanks,
There are a few ways to skin this cat - but probably the simplest would be to add a filter to the FHIR Appliance itself to filter out certain records. This could be by date, and/or by the "tags" the fixing tool will add to the fixed records, which might be a way to identify them.
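As a rough illustration of those two routes, a filter along these lines might sit in front of the records. The tag system and code here are hypothetical placeholders, since we don't yet know exactly what the fixing tool will write.

```python
# Sketch of a record filter using either route: a "fixed" tag in meta.tag,
# or a date cut-off. The tag system and code are hypothetical placeholders.
from datetime import date, datetime

FIX_TAG_SYSTEM = "https://example.org/fixing-tool"  # assumed, not confirmed
FIX_TAG_CODE = "data-fixed"                         # assumed, not confirmed

def passes_filter(resource: dict, cutoff: date) -> bool:
    """Expose a record if it has been fixed (tagged) or is new enough."""
    tags = resource.get("meta", {}).get("tag", [])
    if any(t.get("system") == FIX_TAG_SYSTEM and t.get("code") == FIX_TAG_CODE
           for t in tags):
        return True  # fixed record: safe to expose regardless of its date
    last_updated = resource.get("meta", {}).get("lastUpdated")
    return (last_updated is not None
            and datetime.fromisoformat(last_updated).date() >= cutoff)
```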
HOWEVER, before jumping in, a key consideration is how long the fixing job takes. We don't know... and it MIGHT be quite quick... in which case this ceases to be an issue. (Also, unlike the original load, which had to be sequential, I suspect the fixing could be run in parallel - which would give the option of throwing computing "horsepower" at it to speed it up; see the sketch below.)
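For what it's worth, a parallel fix run could be as simple as the sketch below, assuming records can be fixed independently and the work is mostly I/O-bound; fix_record() is a placeholder for whatever the fixing tool does to a single record.

```python
# Sketch of parallelising the fix, assuming records are independent and the
# work is mostly I/O-bound (e.g. database reads/writes). fix_record() is a
# placeholder; the worker count is the "horsepower" knob.
from concurrent.futures import ThreadPoolExecutor

def fix_record(record: dict) -> dict:
    record["fixed"] = True  # placeholder for the real per-record fix
    return record

def fix_all(records: list[dict], workers: int = 8) -> list[dict]:
    # Unlike the original sequential load, independent records can be fixed
    # concurrently, so raising `workers` shortens the wall-clock time.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fix_record, records))
```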
There is also potentially a separate issue identified here about informing clinicians of how much history they can see. That is a distinct concern, as there are many reasons why history may not be available; data fixing is only one of them. See the existing idea https://interweave-portal.ideas.aha.io/ideas/P-I-62, which covers this.