♊️ GemiNews 🗞️ (dev)


🗞️FHIR Whistle Data Mappings Validation


FHIR Whistle Data Mappings Validation

2024-04-16 - Ashwinshetty (from Google Cloud - Medium)


[Blogs] 🌎 https://medium.com/google-cloud/fhir-whistle-data-mappings-validation-cd62c8613a92?source=rss----e52cf94d98af---4 [🧠] [v2] article_embedding_description: {:llm_project_id=>"Unavailable", :llm_dimensions=>nil, :article_size=>10674, :llm_embeddings_model_name=>"textembedding-gecko"}
[🧠] [v1/3] title_embedding_description: {:ricc_notes=>"[embed-v3] Fixed on 9oct24. Only seems incompatible at first glance with embed v1.", :llm_project_id=>"unavailable possibly not using Vertex", :llm_dimensions=>nil, :article_size=>10674, :poly_field=>"title", :llm_embeddings_model_name=>"textembedding-gecko"}
[🧠] [v1/3] summary_embedding_description:
[🧠] As per bug https://github.com/palladius/gemini-news-crawler/issues/4 we can state this article belongs to title/summary version: v3 (very few articles updated on 9oct24)

🗿article.to_s

------------------------------
Title: FHIR Whistle Data Mappings Validation

Author: Ashwinshetty
PublishedDate: 2024-04-16
Category: Blogs
NewsPaper: Google Cloud - Medium
Tags: healthcare-data-engine, fhir-mapping, whistle-data-mapping, google-cloud-platform, data
{"id"=>9350,
"title"=>"FHIR Whistle Data Mappings Validation",
"summary"=>nil,
"content"=>"

Business Scenario

Healthcare Data Engine (HDE) is a popular GCP-based solution that helps healthcare stakeholders transition to FHIR (Fast Healthcare Interoperability Resources). HDE provides pipelines that convert non-FHIR data to FHIR and reconcile it into a single longitudinal patient record, which makes deriving insights from patient data quick and easy.

One of the core components of HDE is its data mapping language, Whistle. This open-source data mapping language is used to convert complex, nested data from one schema to another, for example from HL7 to FHIR.

This article shows how to use Whistle to write sample mappings for HL7 data, run the mapping code locally, and then test the resulting converted FHIR data against a FHIR store.

What do we need

We will test Whistle mappings that convert sample HL7 data to FHIR. We will use the Cloud Healthcare API to ingest sample HL7 messages into an HL7v2 store, fetch the schematized variant of a message from that store, and run the Whistle mapping code on our local machine to convert it to FHIR. We will then validate the converted FHIR data by ingesting it into a Cloud Healthcare API FHIR store.

Steps

Step 1 — Enable the Cloud Healthcare API in your GCP project.
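
This can be done from the Cloud console or, as one option, with the gcloud CLI (the project placeholder follows the convention used in the commands below):

gcloud services enable healthcare.googleapis.com --project=<gcp-project-name>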

Step 2 — Follow the documentation in the Healthcare Data Harmonization git repo to set up the Whistle engine on our local machine. This requires installing the software below, as we will be using ‘gradle’ to run our Whistle engine application:

Git
JDK 11.x
Gradle 7.x
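
A hedged sketch of the local setup, assuming the repository is the GoogleCloudPlatform/healthcare-data-harmonization project on GitHub (verify the exact location against the documentation):

git clone https://github.com/GoogleCloudPlatform/healthcare-data-harmonization.git
cd healthcare-data-harmonization
java -version    # expect a JDK 11.x
gradle --version # expect Gradle 7.x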

Step 3 — Create a Healthcare API dataset, then an HL7v2 store and a FHIR store inside that dataset. We will use these resources for our testing.

Once created, we should see the three resources, where ‘datastore’ is our Healthcare API dataset, ‘hl7v2store’ is the HL7v2 store and ‘fhirstore’ is the FHIR store.
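
As one hedged way to create these resources from the command line (the names follow the text above; the location placeholder and the R4 FHIR version are assumptions):

gcloud healthcare datasets create datastore --location=<location>
gcloud healthcare hl7v2-stores create hl7v2store --dataset=datastore --location=<location>
gcloud healthcare fhir-stores create fhirstore --dataset=datastore --location=<location> --version=R4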

Step 4 — Let us ingest a sample HL7 message into our HL7 store. Save the below sample message in a file named ‘sample-hl7-msg.hl7’.

MSH|^~\\&|FROM_APP|FROM_FACILITY|TO_APP|TO_FACILITY|20170703223000||ADT^A01|20170703223000|P|2.5|
EVN|A01|20210713083617|
PID|1||21004033^^^^MRN||SULIE^BRAN||19941208|M|||444 MAIN ST^^MOUNTAIN SPRINGS^CO^80444||1111111144|2222222244|
PV1||I|H44 RM4^1^^HIGHWAY 44 CLINIC||||5144^MARRIE QUINIE|||||||||Y||||||||||||||||||||||||||||20170703223000|

The default segment separator in HL7v2 is a carriage return (\\r), but most text editors end lines with newline (\\n) characters. So we will use the command below to replace each \\n with \\r.

sed -z 's/\\n/\\r/g' sample-hl7-msg.hl7 > sample-hl7-msg-fixed.hl7
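
Note that -z is a GNU sed extension, so on macOS you may need GNU sed (gsed) or an equivalent. To confirm that the segments are now separated by carriage returns (od prints them as \\r), a quick check:

od -c sample-hl7-msg-fixed.hl7 | head -n 3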

The HL7v2 store expects input messages as base64-encoded strings, so let us use the command below to encode the sample HL7 message.

openssl base64 -A -in ./sample-hl7-msg-fixed.hl7 -out ./sample-hl7-msg-base64.txt
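
To sanity-check the encoding, the file can be decoded back and compared with the fixed message, for example:

openssl base64 -d -A -in ./sample-hl7-msg-base64.txt | head -c 80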

Copy the encoded string from ‘sample-hl7-msg-base64.txt’ into the format below and save it in a file named ‘hl7v2-sample.json’.

{
"message": {
"data": "<base64-encoded-string>"
}
}
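
If you prefer not to paste the encoded string by hand, one way to generate this file (assuming jq is installed) is:

jq -n --arg data "$(cat sample-hl7-msg-base64.txt)" '{message: {data: $data}}' > hl7v2-sample.json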

We will run the curl command below in a terminal to ingest this message into the HL7v2 store.

curl -X POST      \\
-H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \\
-H "Content-Type: application/json; charset=utf-8" \\
--data-binary @hl7v2-sample.json \\
"https://healthcare.googleapis.com/v1/projects/<gcp-project-name>/locations/<location>/datasets/<dataset-name>/hl7V2Stores/<hl7store-name>/messages:ingest"

Once the command is successful, we will get a ‘message.name’ field in the response as shown below.

{
"hl7Ack": "<base64-encoded-string>",
"message": {
"name": "<gcp-project-name>/locations/<location>/datasets/<dataset-name>/hl7V2Stores/<hl7store-name>/messages/<MESSAGE_ID>",
}
}
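
If you save the ingest response to a file, the message name can be extracted with jq instead of being copied by hand (the response filename here is hypothetical):

# assumes the ingest command above was run with an extra flag: -o ingest-response.json
jq -r '.message.name' ingest-response.json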

Using the ‘message.name’ field, we will next fetch the schematized message into an output JSON file. This file will act as the input for our Whistle mappings.

curl -X GET \\
-H "Authorization: Bearer $(gcloud auth print-access-token)" \\
-H "Content-Type: application/json; charset=utf-8" \\
"https://healthcare.googleapis.com/v1/projects/<project-name>/locations/<location>/datasets/<dataset-name>/hl7V2Stores/<hl7store-name>/messages/<message-name>" \\
| jq '.schematizedData.data | fromjson' > <output-filename.json>
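
Since the sample mapping below reads from $root.ADT_A01.PID, a quick way to confirm that the schematized output actually contains that segment (again assuming jq) is:

jq '.ADT_A01.PID' <output-filename.json>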

Step 5 — Let us open an IDE or a terminal. From the directory where the GitHub repo was cloned, we will run the Gradle command below to trigger the mapping.

gradle :runtime:run -q --args="-m $HOME/wstl_codelab/codelab.wstl -i $HOME/wstl_codelab/<output-filename.json>" > converted-fhir.json
Explanation of the above gradle command:
gradle: This invokes the Gradle build automation tool.
:runtime:run: This tells Gradle to execute the run task to start the Whistle application.
-q: This flag tells Gradle to run in “quiet” mode, suppressing most of the output except for errors.
--args: This introduces the arguments that will be passed to the run task (and ultimately to the application it starts).
-m $HOME/wstl_codelab/codelab.wstl: This argument specifies the path to a whistle file that the application will use for data mapping.
-i $HOME/wstl_codelab/<output-filename.json>: This argument points to a JSON file containing input data for the Whistle mapping.
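
The -m and -i paths above assume both files live under $HOME/wstl_codelab; if yours are elsewhere, a minimal staging sketch (the source file locations are assumptions) is:

mkdir -p $HOME/wstl_codelab
cp codelab.wstl <output-filename.json> $HOME/wstl_codelab/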

Sample Patient Whistle Mapping:

This code is just for demo purposes and does not represent the actual FHIR structure. It maps Patient fields like ‘identifier’, ‘name’ and ‘address’ from the PID segment in our input file. These mappings are structured into functions like ‘Build_Identifier’, ‘Build_Name’ and ‘Build_Address’ for better readability.
PID_Patient($root.ADT_A01.PID)

def PID_Patient(PID){
  identifier[]: Build_Identifier(PID.3[])
  name[]: Build_Name(PID.5[])
  address[]: Build_Address(PID.11[])
  active: true
  resourceType: "Patient"
}

def Build_Identifier(CX) {
  value: CX.1
}

def Build_Name(XPN) {
  family: XPN.1.1
  given[]: XPN.2
  given[]: XPN.3
}

def Build_Address(XAD) {
  line[]: XAD.2
  city: XAD.3
  state: XAD.4
  postalCode: XAD.5
}

Step 6 — Once we have the mapped output, we can check whether all the fields were converted as per our requirements; a quick spot check is sketched below. Once confirmed, we can try to load the output into a FHIR store using the curl command that follows.
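
As one way to spot-check a few of the mapped fields (assuming jq is installed):

jq '{resourceType, identifier: .identifier[0].value, family: .name[0].family, city: .address[0].city}' converted-fhir.json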

curl -X POST \\
-H "Authorization: Bearer $(gcloud auth print-access-token)" \\
-H "Content-Type: application/fhir+json" \\
-d @converted-fhir.json \\
"https://healthcare.googleapis.com/v1/projects/<project-name>/locations/<location>/datasets/<dataset-name>/fhirStores/<fhirstore-name>/fhir/Patient"

After the above command completes successfully, we should be able to see the record in our FHIR store.
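
One way to verify this is to search the FHIR store for the patient by the family name taken from the sample message:

curl -X GET \\
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \\
  "https://healthcare.googleapis.com/v1/projects/<project-name>/locations/<location>/datasets/<dataset-name>/fhirStores/<fhirstore-name>/fhir/Patient?family=SULIE"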

Conclusion

By following the steps outlined above, we explored a method to validate the HL7-to-FHIR conversion workflow using the open-source Whistle data mapping repository. This approach can be readily adapted to validate conversion workflows from any other data format to FHIR. The technique is useful for quick tests, proofs of concept (POCs), or pilot projects for healthcare data conversion to FHIR, and working through it gives a deeper understanding of the capabilities of the powerful Whistle data mapping language.

Reference Links

Whistle github repo

Cloud Healthcare API documentation

FHIR Whistle Data Mappings Validation was originally published in Google Cloud - Community on Medium, where people are continuing the conversation by highlighting and responding to this story.

",
"author"=>"Ashwinshetty",
"link"=>"https://medium.com/google-cloud/fhir-whistle-data-mappings-validation-cd62c8613a92?source=rss----e52cf94d98af---4",
"published_date"=>Tue, 16 Apr 2024 00:06:38.000000000 UTC +00:00,
"image_url"=>nil,
"feed_url"=>"https://medium.com/google-cloud/fhir-whistle-data-mappings-validation-cd62c8613a92?source=rss----e52cf94d98af---4",
"language"=>nil,
"active"=>true,
"ricc_source"=>"feedjira::v1",
"created_at"=>Tue, 16 Apr 2024 19:08:49.486710000 UTC +00:00,
"updated_at"=>Mon, 21 Oct 2024 20:03:37.418771000 UTC +00:00,
"newspaper"=>"Google Cloud - Medium",
"macro_region"=>"Blogs"}