Article · March 18, 2024 · 2 min read

Getting data from InterSystems IRIS CloudSQL using xDBC

Recently, while discussing with my students how to access data stored in IRIS from different languages, the question came up whether it was possible to initiate the connection and get data from a cloud solution (InterSystems IRIS CloudSQL) into Microsoft Excel, rather than the other way around. Considering the many ways one can get data into Excel (importing data from external sources, connecting to databases using ODBC drivers, using Power Query and web queries, etc.), the obvious choice was to try the ODBC driver. The only task left was to connect to the database in the cloud using it.

Quite unsurprisingly, it worked!

If you're new to CloudSQL, I would highly suggest you read the documentation. From it you will learn how to get access to the InterSystems Cloud Services Portal (or you can go directly to AWS and subscribe with your AWS account) and download the necessary drivers. By the way, if you encounter problems with CloudSQL, you may try looking for answers in this document first.

Now that these preliminaries are out of the way, let's try the simplest approach - setting up the ODBC data source with just a password and getting data from it in MS Excel.

Step 0. Set up your InterSystems IRIS CloudSQL. You will see the necessary settings on the Overview page of your Deployment:

Step 00. Download and install the ODBC driver for your OS.

Step 1. Open your ODBC Data Sources and switch to the System DSN tab. In it, click on Add...

and fill in the settings from your Overview page (first screenshot of the article).
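The steps above use the Windows DSN dialog. If you are configuring the driver on Linux or macOS instead, the same settings go into an odbc.ini entry. A minimal sketch, with placeholder values: replace the host, port, and namespace with the values from your deployment's Overview page, and adjust the driver path to wherever you installed it:

[IRIS-CloudSQL]
; path to the installed InterSystems ODBC driver (adjust to your install)
Driver      = /usr/lib/libirisodbcur64.so
Description = InterSystems IRIS CloudSQL
; placeholders: use the values from your deployment's Overview page
Host        = your-deployment-host.elb.us-east-1.amazonaws.com
Port        = 1972
Namespace   = USER
UID         = SQLAdmin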

Step 2. Connect Excel to IRIS. Open Excel, navigate to the Data tab, and select "Get Data" or "From Other Sources," depending on your Excel version. Choose "From ODBC" as the data source,

select the ODBC data source you configured earlier,

enter authentication credentials if asked,

and choose the table you wish to load into Excel.

Step 3. You're good to go and can do whatever you like with your data.

PS. Some other interesting articles/videos that I would suggest you read/watch regarding ODBC and CloudSQL:

  • a whole series of videos that show how to connect to CloudSQL from solutions written in different languages
  • an interesting article on how to set up roles and give permissions to xDBC connections
  • another article on how to use SSL/TLS to establish secure connections over JDBC and other driver technologies
  • and the last one about switching between ODBC drivers in an environment with multiple versions installed
Article · March 18, 2024 · 3 min read

Reading and writing arbitrary XML documents

This article is from the InterSystems FAQ site.

[Reading an arbitrary XML document]
To read an arbitrary XML document, use the %XML.TextReader class.
Parse the document with the Parse method (or ParseFile() if the document is a file), then retrieve the properties of each node.

For example, given the XML below,

 <emp empid="1">
    <name>Suzuki</name>
    <address>Tokyo</address>
 </emp>

 
each element and each text item is treated as a unit called a "node", and the nodes can be retrieved as shown in the table below.

seq | NodeType   | Name    | Value  | Attribute (LocalName=Value)
----+------------+---------+--------+----------------------------
 1  | element    | emp     |        | empid=1
 2  | element    | name    |        |
 3  | chars      |         | Suzuki |
 4  | endelement | name    |        |
 5  | element    | address |        |
 6  | chars      |         | Tokyo  |
 7  | endelement | address |        |
 8  | endelement | emp     |        |

Code example:

readXML
  set sc=##class(%XML.TextReader).ParseFile("C:\temp\aaa.xml",.treader)
  if 'sc { do $SYSTEM.Status.DisplayError(sc) quit }
  while (treader.Read()) {
    write treader.seq," "
    write "[Type]",treader.NodeType," "
    write "[Name]",treader.Name," "
    write "[Value]",treader.Value," "
    if (treader.NodeType="element") {
      // write out each attribute of the current element
      for i=1:1:treader.AttributeCount {
        do treader.MoveToAttributeIndex(i)
        write "[Att] ",treader.LocalName,"=",treader.Value
      }
    }
    write !
  }
  quit
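
For the sample document above, this prints one line per node, along these lines (a sketch; the exact spacing will differ):

1 [Type]element [Name]emp [Value] [Att] empid=1
2 [Type]element [Name]name [Value]
3 [Type]chars [Name] [Value]Suzuki
4 [Type]endelement [Name]name [Value]
5 [Type]element [Name]address [Value]
6 [Type]chars [Name] [Value]Tokyo
7 [Type]endelement [Name]address [Value]
8 [Type]endelement [Name]emp [Value]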


For more details on %XML.TextReader, please refer to the documentation:
[Documentation] Using %XML.TextReader

[Writing an arbitrary XML document]
To create (write) an arbitrary XML document, use the %XML.Writer class.

Code example:

writeXML
    set xml=##class(%XML.Writer).%New()
    set xml.Indent=1
    do xml.OutputToFile("C:\temp\out.xml")
    do xml.RootElement("employees")
    do xml.Element("emp"),xml.WriteAttribute("empid","1")
    do xml.Element("name"),xml.WriteChars("Suzuki"),xml.EndElement()
    do xml.EndElement() // emp
    do xml.EndRootElement() // employees
    quit

 
Running the above produces a file with the following contents:

<?xml version="1.0" encoding="UTF-8"?>
<employees>
  <emp empid="1">
    <name>Suzuki</name>
  </emp>
</employees>


For more details on %XML.Writer, please refer to the class reference:
[Class Reference] %XML.Writer

Article · March 16, 2024 · 4 min read

Creating Unit Tests in ObjectScript for HL7 pipelines using %UnitTest class

One of the pain points for maintaining HL7 interfaces is the need to run a reliable regression test upon deployment to new environments and after upgrades. The %UnitTest class allows unit tests to be created and packaged alongside interface code. Test data can also be maintained within the unit test class, allowing for quick and easily repeatable smoke-testing and regression testing.
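
As a sketch of what this can look like (the DTL class name MyPkg.ADTToSIU is a placeholder for one of your own transformations, and the sample message is embedded directly in the test class):

Class MyPkg.Test.ADTTransform Extends %UnitTest.TestCase
{

/// Parse an embedded test message, run a (placeholder) DTL transformation,
/// and assert on one mapped field.
Method TestTransformMapsGivenName()
{
    // test data maintained inside the unit test class
    set raw = "MSH|^~\&|SRC|SRCFAC|DST|DSTFAC|202403150800||ADT^A01|MSG00001|P|2.5"
    set raw = raw_$char(13)_"PID|1||12345^^^MRN||Doe^John||19800101|M"
    set source = ##class(EnsLib.HL7.Message).ImportFromString(raw,.sc)
    do $$$AssertStatusOK(sc,"Sample ADT message parsed")

    // MyPkg.ADTToSIU stands in for your own data transformation class
    set sc = ##class(MyPkg.ADTToSIU).Transform(source,.target)
    do $$$AssertStatusOK(sc,"Transformation completed")
    do $$$AssertEquals(target.GetValueAt("PID:5.2"),"John","Given name mapped through")
}

}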

Resources:

  • Standard %UnitTest class

Article · March 15, 2024 · 4 min read

Uncovering Clues by Querying the Interoperability Message tables

When using InterSystems IRIS as an interoperability engine, we all know and love how easy it is to use the Message Viewer to review message traces and see exactly what's going on in your production. When a system is handling millions of messages per day, you may not know exactly where to begin your investigation though.

Over my years supporting IRIS productions, I often find myself investigating things like...

  • What sort of throughput does this workflow have?
  • Where is the bottleneck?
  • What are my most common errors?

One of my favorite places to look for clues is the Message Header table, which stores metadata about every message running through the system. These are the same messages that appear in the Message Viewer and the Visual Traces. 
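
If you have never queried this table directly, a quick look at its most useful columns is a good starting point (a minimal sketch; run it in your production's namespace):

SELECT TOP 10 ID, TimeCreated, TimeProcessed, SessionId, SourceConfigName,
       TargetConfigName, MessageBodyClassName, Status
FROM Ens.MessageHeader
ORDER BY TimeCreated DESC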

I've built up a collection of handy SQL queries, and I'd love to share them with you. My examples are mostly from HealthShare or IRIS for Health use cases, but they can be easily adapted for whatever workflow you have...

-- SQL query to find the # of messages through a component per day
select {fn SUBSTRING(timeprocessed,1,10)} AS day, count(*) MessagesThisDay 
FROM Ens.MessageHeader
where TargetConfigName = 'HS.Hub.Push.Evaluator' 
GROUP BY {fn SUBSTRING(timeprocessed,1,10)}
ORDER BY day ASC

-- SQL query to find long-running messages through particular components
SELECT PReq.SessionID as SessionId, 
  PReq.TimeCreated as pReqTimeCreated, 
  PRes.TimeCreated as pResTimeCreated, 
  {fn TIMESTAMPDIFF(SQL_TSI_SECOND, PReq.TimeCreated,PRes.TimeCreated)} as TimeDelay
FROM (
  SELECT ID, SessionId, TimeCreated
  FROM Ens.MessageHeader
  WHERE MessageBodyClassName = 'HS.Message.PatientSearchRequest'
  AND SourceConfigName = 'HS.Hub.MPI.Manager'
  AND TargetConfigName = 'HUB'
) as PReq
INNER JOIN (
  SELECT ID, SessionId, TimeCreated
  FROM Ens.MessageHeader
  WHERE MessageBodyClassName = 'HS.Message.PatientSearchResponse'
  AND SourceConfigName = 'HS.Hub.MPI.Manager'
  AND TargetConfigName = 'HS.IHE.PIX.Manager.Process'
) as PRes on pReq.SessionId = PRes.SessionId
WHERE {fn TIMESTAMPDIFF(SQL_TSI_SECOND, PReq.TimeCreated,PRes.TimeCreated)} > 1
ORDER BY SessionId desc

/*-- Query to find the bottleneck message through a particular component
  -- set your threshold for "how long is too long" (e.g. 20 seconds)
  -- look for clusters of messages that are longer than that (e.g. the first cluster started at 3:22:00, then there was a second cluster at 5:15:30)
  -- in each cluster, look at the first message in that cluster (chronologically). That is likely to be the bottleneck message, and all messages after it are victims of its bottleneck 
*/
SELECT %NOLOCK req.TargetConfigName, req.MessageBodyClassName, req.SessionId, req.TimeCreated, req.TimeProcessed, {fn TIMESTAMPDIFF(SQL_TSI_SECOND, req.TimeCreated, req.TimeProcessed)} as TimeToProcess
FROM Ens.MessageHeader AS req
WHERE req.TargetConfigName = 'HS.Hub.Management.Operations'
  AND req.TimeCreated BETWEEN '2021-04-21 00:00:00' AND '2021-04-21 11:00:00'
  AND {fn TIMESTAMPDIFF(SQL_TSI_SECOND, req.TimeCreated, req.TimeProcessed)} > 20

/* If you have a particular error that you're investigating, try this one. It scans through the Ensemble Error Log for "Object to Load not found" entries, then returns some key fields from the relevant PatientSearchRequest message */
SELECT l.SessionId, mh.MessageBodyID, mh.TimeCreated, psr.SearchMode, psr.RequestingUser, psr.FirstName, psr.MiddleName, psr.LastName, psr.SSN, psr.Sex, psr.DOB
FROM Ens_Util.Log as l
INNER JOIN Ens.MessageHeader as mh on l.SessionId = mh.SessionId
INNER JOIN HS_Message.PatientSearchRequest as psr on mh.MessageBodyID = psr.ID
WHERE l.Type = 'Error'
AND l.ConfigName = 'HSPI.Server.APIOperation'
AND l.Text like 'ERROR #5809: Object to Load not found%'
AND mh.MessageBodyClassName = 'HS.Message.PatientSearchRequest'
AND mh.SourceConfigName = 'HSPI.Server.APIWebService'
AND mh.TargetConfigName = 'HSPI.Server.APIOperation'

-- Scan the Ensemble Error Log for a particular timeframe. Count up the different types of errors
SELECT substring(text,1,80) as AbbreviatedError, count(*) as NumTheseErrors
FROM Ens_Util.Log
WHERE Type = 'Error'
AND TimeLogged > '2022-03-03 00:00:00' -- when the last batch started
AND TimeLogged < '2022-03-03 16:00:00' -- when we estimate this batch might end
GROUP BY substring(text,1,80)
ORDER BY NumTheseErrors desc

-- Find the Gateway Processing Time for each StreamletRequest / ECRFetchResponse pair
SELECT sr.Gateway,request.sessionid, response.sessionid, request.timecreated AS starttime, response.timecreated AS stoptime, 
  datediff(ms,request.timecreated,response.timecreated) AS ProcessingTime, 
  Avg(datediff(ms,request.timecreated,response.timecreated)) AS AverageProcessingTimeAllGateways
FROM Ens.MessageHeader request
INNER JOIN Ens.MessageHeader AS response ON response.correspondingmessageid = request.id
INNER JOIN HS_Message.StreamletRequest AS sr ON sr.ID = request.MessageBodyId
WHERE request.messagebodyclassname = 'HS.Message.StreamletRequest'
AND response.messagebodyclassname = 'HS.Message.ECRFetchResponse'
Article · March 14, 2024 · 7 min read

Tutorial: Adding OpenAI to Interoperability Production

Artificial Intelligence (AI) is getting a lot of attention lately because it can change many areas of our lives. Better computer power and more data have helped AI do amazing things, like improving medical tests and making self-driving cars. AI can also help businesses make better decisions and work more efficiently, which is why it is becoming more popular and widely used. How can one integrate OpenAI API calls into an existing IRIS interoperability application?

 

Prerequisites

In this tutorial we will assume that you already have an existing interoperability production and a set of OpenAI credentials to make calls to OpenAI APIs. You can download the code used in this tutorial from the following GitHub project branch: https://github.com/banksiaglobal/bg-openai/tree/test-app-original
To learn how to get OpenAI credentials, follow this tutorial: https://allthings.how/how-to-get-your-open-ai-api-key/ or just open the OpenAI API Keys page and create one: https://platform.openai.com/api-keys

Original Application

 

Our application, AppExchange, emulates InterSystems Open Exchange publishing: it gets a request with a project description, a project logo, and a GitHub URL, and publishes it in the AppExchange repository.

 

Adding a bit of Artificial Intelligence

Now let's assume that a person who looks after our repository noticed that some app developers are lazy and do not provide either a short summary or a logo for the apps they publish. That's where our AI friend can come to the rescue!

The desired workflow would look like this:

  1. The application receives a repository URL, a summary, and a logo URL as input.
  2. If summary is empty, the URL is sent to a GPT-based model that parses the repository contents and generates a descriptive summary of the project. This process may involve parsing README files, code comments, and other relevant documentation within the repository to extract key information about the project's purpose, features, and usage.
  3. The generated project summary is then used as an input to another GPT-based model, which is tasked with creating a logo for the project. This model uses the description to understand the project's theme, and then designs a logo that visually represents the project's essence and identity.
  4. The application outputs a response that includes the original URL, the generated project summary, and the newly created logo. This response provides a comprehensive overview of the project, along with a visual identifier that can be used in branding and marketing efforts.

To achieve this integration, we will use the Business Process Designer to visually design the application's workflow.

 

Step 1: Installation

To start, we will install the bg-openai package from Open Exchange using the ZPM package manager:

zpm "install bg-openai"


You can have a look at this package here: https://openexchange.intersystems.com/package/bg-openai-1 and check out its source code here: https://github.com/banksiaglobal/bg-openai

This package is based on the great work of Francisco Lopez, available here: https://github.com/KurroLopez/iris-openai, with four small changes: we changed the class names to be more in line with standard IRIS naming conventions, we added a new SimplePrompt request which allows users to send simple AI text prompts very easily, we changed the API key to be a credential rather than a setting, and we changed the top-level package name to "Banksia" in line with company standards.
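
To give a feel for the SimplePrompt request before we wire it into the production, here is a minimal ObjectScript sketch (class and property names as used in Step 3 below; sending the request assumes code running inside a business process, with the OpenAiOut operation we configure in Step 2):

set req = ##class(Banksia.OpenAi.Msg.SimplePrompt.Request).%New()
set req.Prompt = "Describe the main idea of the project in one paragraph."
set req.UserInput = "https://github.com/banksiaglobal/bg-openai"
set req.Model = "gpt-4"
// from inside a business process, send it synchronously to the operation
set sc = ..SendRequestSync("OpenAiOut",req,.resp)
// on success, resp.Content holds the generated text
if sc { write resp.Content }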

 

Step 2: Set up OpenAI Operation

To configure the production further, open the Management Portal, which is located at the following link if you are using the Docker image with our original application:

http://localhost:42773/csp/sys/UtilHome.csp


Navigate to Interoperability->[Namespace]->Configure->Production and make sure that our original production is running.

Add a new operation based on the class Banksia.OpenAi.Operation, name it OpenAiOut, and enable it. This operation will communicate with the OpenAI API servers.

  • Operation Class: Banksia.OpenAi.Operation
  • Operation Name: OpenAiOut

 

 

Now let's do the minimal setup required to use our new operation in the production: add an API key and an SSL configuration.

Navigate to OpenAiOut->Settings->Basic Settings->Credentials and click on the magnifying glass icon to configure credentials.

Fill in the form, putting your API key in the Password field; the ID and User Name fields you can fill in as you like. Save the data by clicking Save.

 

In the Credentials field, select the ID of the credentials we saved earlier.

 

 

Set up the SSL configuration: create a new client SSL configuration named OpenAiSSL and select it in the dropdown.

 

 

 

Step 3 - Add Summary Generation to Business Process using the Business Process Designer

Navigate to Interoperability > Business Process Designer and open the AppExchange.Process business process by clicking Open.

Build a flowchart of the process based on the algorithm we described above.
An example implementation is shown in the image below.

 

Add a condition that checks that a repository URL is provided and that no summary has been entered, meaning we need to query ChatGPT to create a description:

(request.Summary="") & (request.GitHubUrl '="")

   

Then add a <call> block and set its target to OpenAiOut which, depending on the type of request, will make a call to the OpenAI API.

  •  Name: Generate Summary 

   

Configure the request and response message types, and assign the variables used in the request and response actions.

  • Request Message Class: Banksia.OpenAi.Msg.SimplePrompt.Request

set callrequest.Prompt = "Visit the website you will be provided on the next step. Describe the main idea of the project, its objectives and key features in one paragraph."

set callrequest.UserInput = request.GitHubUrl 

set callrequest.Model = "gpt-4" 

  • Response Message Class: Banksia.OpenAi.Msg.SimplePrompt.Response

set request.Summary = callresponse.Content 

 

Add a <sync> step to wait for the response; in the Calls field, add the name of the previous <call>:

  • Calls: Generate Summary

 

Step 4 - Add Logo Generation to Business Process

 

After getting the repository description, let's move on to the next logical part: logo generation. We check that there is a summary from which the image can be generated and that no logo URL was provided. Let's set the following condition:

(request.LogoUrl="") & (request.Summary'="")

 

Configure the next <call> element, also targeting our OpenAiOut operation.

  •  Name: Generate Logo

 

 

Configure the request and response message types.

  • Request Message Class: Banksia.OpenAi.Msg.Images.Request

set callrequest.ResponseFormat = "url"

set callrequest.Operation = "generations"

set callrequest.Prompt = "Create a simple app icon for the following mobile application: "_request.Summary

set callrequest.Size = "256x256"

  • Response Message Class: Banksia.OpenAi.Msg.Images.Response

set request.LogoUrl = callresponse.Data.GetAt(1).Url

    

After completing the modification of our business process, click the Compile button.

You can download the finished OpenAI integrated sample from the following GitHub project branch: https://github.com/banksiaglobal/bg-openai/tree/test-app

 

Step 5: Test our new Business Process in Production

Go to the Interoperability->Configure->Production section

 

First, we need to restart our process to apply all the latest changes: navigate to AppProcess->Actions->Restart.

To test the process, go to AppProcess->Actions->Test.
Create a test message with a GitHub URL for the OpenAI API and send it through the production:

 

 

Verify that the response from the OpenAI API is received and processed correctly by the application. Go to Visual Trace to see the full application cycle and make sure that the correct data is transmitted in each process element.

 

 

This is AI's take on our app logo:

 

Conclusion

By following these steps, you can integrate the OpenAI API into an interoperability production using a business process in InterSystems IRIS. The bg-openai module is a great resource for developers looking to incorporate AI into their applications. By simplifying the integration process, it opens up new possibilities for enhancing applications with the power of artificial intelligence.

 

About Author

Mariia Nesterenko is a certified IRIS developer at Banksia Global. She is involved in application development, data structures, system interoperability, and geospatial data.

About Banksia Global

Banksia Global is an international boutique consultancy headquartered in Sydney, Australia, specializing in providing professional services for InterSystems technologies. With a team of dedicated and experienced professionals, we pride ourselves on being an official InterSystems Premier Partner, authorized to provide services worldwide. Our passion for excellence and innovation drives us to deliver high-quality solutions that meet the unique needs of our clients.
