Loading data from a web service: an alternate approach

I was recently working on a project where my client had just implemented a new system, and they wanted to use the web services provided by this platform for some additional reporting that wasn't available out of the box. Since we use APEX, we had the nice APEX_WEB_SERVICE API to use - which made things nice and simple, but before that we needed to perform a couple of extra setup steps, which are required each time you want to pull data over the web from another domain.

It's actually, I feel, too complicated to get going with - but I understand these restrictions are in place to keep systems nice and secure.

In a nutshell, we are dealing with resources secured with certificates - so we need to create both an ACL granting access to the resource and a wallet containing all the certificates the site uses. After that, we are good to go - until some time down the track when the service updates its certificates, at which point we once again need to update our wallet with the site's new certificates.

Now, we got everything working - the web service returns an XML feed, which we parse and load in tabular form into one of our tables. This was set up to run once a day - real-time data isn't a strong requirement.

Given we are only running this once a day, I thought another approach could be to leverage SQLcl. It supports scripting with JavaScript through its embedded Nashorn engine, which also gives you access to Java classes. So, first I fired up my Java IDE to prototype a GET request, and came up with the following:

// Imports needed for the prototype (Apache HttpClient 4.x)
import java.io.IOException;

import org.apache.http.HttpEntity;
import org.apache.http.HttpResponse;
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.CredentialsProvider;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.apache.http.impl.client.HttpClientBuilder;
import org.apache.http.util.EntityUtils;

// Set up basic authentication credentials for the service
CredentialsProvider provider = new BasicCredentialsProvider();
UsernamePasswordCredentials creds = new UsernamePasswordCredentials("fakeuser", "fakepassword");
provider.setCredentials(AuthScope.ANY, creds);

HttpClient client = HttpClientBuilder.create()
        .setDefaultCredentialsProvider(provider)
        .build();
HttpGet get = new HttpGet("https://example.com/path/to/resource");

try {

    // Execute the GET and check the status code
    HttpResponse resp = client.execute(get);
    System.out.println(resp.getStatusLine().getStatusCode());

    // Read the response body as a string
    HttpEntity entity = resp.getEntity();
    String respBody = EntityUtils.toString(entity);
    System.out.println(respBody);

} catch (IOException e) {
    e.printStackTrace();
}


That worked well, so next I needed to convert this into a script that could be run with Nashorn. I figured out all the classes I needed and added the following to a JavaScript file (effectively, import statements):

// All the required Java classes
var CredentialsProvider = Java.type("org.apache.http.client.CredentialsProvider"),
    BasicCredentialsProvider = Java.type("org.apache.http.impl.client.BasicCredentialsProvider"),
    UsernamePasswordCredentials = Java.type("org.apache.http.auth.UsernamePasswordCredentials"),
    AuthScope = Java.type("org.apache.http.auth.AuthScope"),
    HttpClient = Java.type("org.apache.http.client.HttpClient"),
    HttpClientBuilder = Java.type("org.apache.http.impl.client.HttpClientBuilder"),
    HttpGet = Java.type("org.apache.http.client.methods.HttpGet"),
    HttpResponse = Java.type("org.apache.http.HttpResponse"),
    HttpEntity = Java.type("org.apache.http.HttpEntity"),
    EntityUtils = Java.type("org.apache.http.util.EntityUtils"),
    HashMap = Java.type("java.util.HashMap");


So, with all of those now available in my script, I could begin porting over the same implementation that I had in Java.
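
Roughly, that translation might look like the following sketch (the credentials and URL are the same placeholders as in the Java prototype, and print is a Nashorn built-in):

// Same GET request as the Java prototype, rewritten for Nashorn
var provider = new BasicCredentialsProvider();
var creds = new UsernamePasswordCredentials("fakeuser", "fakepassword");
provider.setCredentials(AuthScope.ANY, creds);

var client = HttpClientBuilder.create()
    .setDefaultCredentialsProvider(provider)
    .build();
var get = new HttpGet("https://example.com/path/to/resource");

// Execute the request and report the status code
var resp = client.execute(get);
print(resp.getStatusLine().getStatusCode());

// Keep the response body as a string so it can be bound into a query later
var entity = resp.getEntity();
var respBody = EntityUtils.toString(entity);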

Once I had my response coming through, I needed to add it to a HashMap as a Clob (I discovered that SQLcl cuts off values larger than 4000 characters, so a Clob seemed necessary - especially if the web service response grows beyond that).

I took a look at some example code and saw a Blob being created with `conn.createBlob()`, so I copied that style but used it for a Clob instead, as per the following example:

// Bind map for the query; the XML response is bound as a Clob
var map = new HashMap();
var xmlAsClob = conn.createClob();
xmlAsClob.setString(1, respBody);

map.put("xml", xmlAsClob);


This map allows us to use a bind variable (:xml) in our query via the provided `util.execute` function.

For this basic example, I just wanted to log the count returned in the XML each day. So, for the insert statement, I came up with the following query and implementation:

// Insert today's count, extracted from the XML feed with XMLTABLE
var insertCountQuery =
"insert into org_count (date_run, count) " +
"select " +
"    sysdate" +
"  , orgs.org_count " +
"from " +
"    xmltable( " +
"        '/Result' " +
"        passing xmltype.createxml(:xml) " +
"            columns " +
"                org_count number path '/Result/count' " +
"    ) orgs";

var insertCountResult = util.execute(insertCountQuery, map);

if (insertCountResult) {
    print("Successfully updated from the web service");
} else {
    print("An error occurred");
    print("sqldev.last.err.message.forsqlcode reported:");
    print(ctx.getProperty("sqldev.last.err.message.forsqlcode"));
}


I saved this script as `fetchAndLoad.js`, so when I'm connected in SQLcl, I just need to run the command: `script fetchAndLoad.js`. And voila - our data is loaded into our table.

To get this to run daily, it would be a good idea to wrap this in a little shell script and add it to your crontab (or Task Scheduler if you're on Windows).
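
As a rough sketch (the connection details, paths, and schedule below are all placeholders), the wrapper and crontab entry might look something like this:

#!/bin/bash
# run_fetch.sh - connect with SQLcl and run the sync script
cd /path/to/scripts
sql myuser/mypassword@//dbhost:1521/orcl @run_fetch.sql

Where run_fetch.sql simply contains:

script fetchAndLoad.js
exit

And the crontab entry to kick it off each morning:

0 6 * * * /path/to/scripts/run_fetch.sh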

Take a look at the full example code: https://github.com/tschf/orcl-wssync-sqlcl-demo
