Hands on with Java XML filter pipelines

Ignored by many

Setting up a pipeline

Once you have your filter written and compiled, you need to set up a pipeline for processing your XML. Data should flow from the input XML document, through the filter, to the reader. You can even have multiple filters, stacked on top of one another. As long as the input comes first and your reader (with its application-specific callbacks) comes last, everything works fine.
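To make the stacking concrete, here is a minimal sketch (not from the original listings) that chains two pass-through `XMLFilterImpl` instances on top of a parser and counts the start tags that reach the application at the end of the chain. The class name `StackedFilters` and the counting handler are illustrative assumptions:

```java
import java.io.StringReader;
import java.util.concurrent.atomic.AtomicInteger;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.DefaultHandler;
import org.xml.sax.helpers.XMLFilterImpl;
import org.xml.sax.helpers.XMLReaderFactory;

public class StackedFilters {
    /** Parses the XML through two stacked pass-through filters and
     *  returns how many start tags reached the application handler. */
    public static int parseWithStack(String xml) throws Exception {
        XMLReader parser = XMLReaderFactory.createXMLReader();
        XMLFilterImpl first = new XMLFilterImpl(parser);  // sits on the parser
        XMLFilterImpl second = new XMLFilterImpl(first);  // sits on the first filter

        final AtomicInteger count = new AtomicInteger();
        // The application's callbacks are registered on the outermost filter.
        second.setContentHandler(new DefaultHandler() {
            @Override
            public void startElement(String uri, String local, String qName,
                                     Attributes atts) {
                count.incrementAndGet();
            }
        });
        // parse() is invoked on the outermost filter, not on the parser directly.
        second.parse(new InputSource(new StringReader(xml)));
        return count.get();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(parseWithStack("<root><a/><b/></root>"));
    }
}
```

Because each filter simply delegates to its parent, all three start tags pass through both layers untouched.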

The following listings show how to set up a program to use filters. To do this we have defined a simple XML application (called SimpleXMLApplication). This is a standard SAX content handler (it extends DefaultHandler to inherit the default behaviour). The application merely echoes the XML passed to it to standard output, with suitable indentation.

Listing: SimpleXMLApplication, a SAX content handler
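The original listing is not reproduced here; a minimal sketch in the same spirit (extending DefaultHandler and echoing elements with indentation) could look like the following. The class name `EchoHandler` and its helper methods are assumptions, not the article's actual code:

```java
import java.io.StringReader;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.DefaultHandler;
import org.xml.sax.helpers.XMLReaderFactory;

/** Echoes elements to a buffer with two-space indentation per level,
 *  in the style described for SimpleXMLApplication. */
public class EchoHandler extends DefaultHandler {
    private int depth = 0;
    private final StringBuilder out = new StringBuilder();

    @Override
    public void startElement(String uri, String local, String qName,
                             Attributes atts) {
        indent();
        out.append('<').append(qName).append(">\n");
        depth++;
    }

    @Override
    public void endElement(String uri, String local, String qName) {
        depth--;
        indent();
        out.append("</").append(qName).append(">\n");
    }

    private void indent() {
        for (int i = 0; i < depth; i++) out.append("  ");
    }

    /** Parses the XML and returns the indented echo. */
    public static String echo(String xml) throws Exception {
        XMLReader reader = XMLReaderFactory.createXMLReader();
        EchoHandler handler = new EchoHandler();
        reader.setContentHandler(handler);
        reader.parse(new InputSource(new StringReader(xml)));
        return handler.out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.print(echo("<root><a/></root>"));
    }
}
```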

The XMLFilterTest harness class links the SimpleXMLFilter and the SimpleXMLApplication together. Notice that because one or more filters must sit between the input source and the reader, all the operations that you would normally invoke on the reader are invoked on the filter instead. The filter then delegates any data that passes through it to the reader.

Listing: XMLFilterTest, a simple Java pipeline application

Also note that we obtain the XMLReader from the root parser and set it as the "parent" of the filter. We then link the filter with the application by making the application the content handler and document handler of the filter.
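The wiring just described can be sketched as follows. This is an assumed reconstruction in the shape of XMLFilterTest, using a plain `XMLFilterImpl` in place of the article's SimpleXMLFilter; the names `XMLFilterTestSketch` and `run` are illustrative:

```java
import java.io.StringReader;
import org.xml.sax.InputSource;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.DefaultHandler;
import org.xml.sax.helpers.XMLFilterImpl;
import org.xml.sax.helpers.XMLReaderFactory;

public class XMLFilterTestSketch {
    /** Builds the pipeline (parser -> filter -> application) and parses. */
    public static boolean run(String xml) throws Exception {
        // Obtain the XMLReader from the root parser...
        XMLReader reader = XMLReaderFactory.createXMLReader();

        // ...and set it as the "parent" of the filter.
        XMLFilterImpl filter = new XMLFilterImpl();
        filter.setParent(reader);

        // Link the filter with the application by making the application
        // the content handler of the filter.
        DefaultHandler application = new DefaultHandler();
        filter.setContentHandler(application);

        // Operations you would normally invoke on the reader are
        // invoked on the filter instead.
        filter.parse(new InputSource(new StringReader(xml)));
        return true;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(run("<doc/>") ? "pipeline ran" : "failed");
    }
}
```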

The effect of running the XMLFilterTest on a simple XML document is presented below:

Output: the effect of running XMLFilterTest on a simple XML document

Data Pollution

However, a word of warning if you use a filter to remove elements from the data passed to the next stage in the pipeline: it is all too easy to pollute the data being sent on. For example, consider the case where you decline to delegate in the startElement() method for certain data, but forget to do the same in endElement(). The result would be that some elements are never reported as starting, yet are reported to the reader as ending. At best this causes program errors; at worst, data loss or corruption in your application.
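One way to keep start and end suppression paired is a depth counter, as in this hypothetical element-dropping filter (the `<secret>` element name, the class `DroppingFilter`, and the `filteredTags` helper are all illustrative assumptions):

```java
import java.io.StringReader;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.DefaultHandler;
import org.xml.sax.helpers.XMLFilterImpl;
import org.xml.sax.helpers.XMLReaderFactory;

/** Drops <secret> elements (and everything inside them). The depth
 *  counter guarantees that every suppressed startElement() has a
 *  matching suppressed endElement(), so the downstream handler never
 *  sees an end tag whose start was removed. */
public class DroppingFilter extends XMLFilterImpl {
    private int skipDepth = 0;

    @Override
    public void startElement(String uri, String local, String qName,
                             Attributes atts) throws SAXException {
        if (skipDepth > 0 || "secret".equals(qName)) {
            skipDepth++;                       // suppress: do not delegate
        } else {
            super.startElement(uri, local, qName, atts);
        }
    }

    @Override
    public void endElement(String uri, String local, String qName)
            throws SAXException {
        if (skipDepth > 0) {
            skipDepth--;                       // the matching suppression
        } else {
            super.endElement(uri, local, qName);
        }
    }

    /** Returns the start tags that survive the filter, space-separated. */
    public static String filteredTags(String xml) throws Exception {
        DroppingFilter filter = new DroppingFilter();
        filter.setParent(XMLReaderFactory.createXMLReader());
        final StringBuilder seen = new StringBuilder();
        filter.setContentHandler(new DefaultHandler() {
            @Override
            public void startElement(String uri, String local, String qName,
                                     Attributes atts) {
                seen.append(qName).append(' ');
            }
        });
        filter.parse(new InputSource(new StringReader(xml)));
        return seen.toString().trim();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(filteredTags("<root><secret><x/></secret><a/></root>"));
    }
}
```

If endElement() forgot to check the counter, the downstream handler would receive the `</secret>` end tag with no corresponding start, which is exactly the pollution described above.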

Real world use

As an example of a tool that makes extensive use of pipelines of filters, consider the DeltaXML XML diff tool. This uses XML filters to pre- and post-process XML data before performing XML comparisons, synchronisation operations and patches. To download a time-limited copy of DeltaXML, see the DeltaXML web site. ®
