Hands on with Java XML filter pipelines

Ignored by many


Typical uses of Filters

Some typical uses of filters include:

• Normalization of whitespace in which contiguous whitespace PCDATA is replaced by a single space.

• Ignoring information in the originating XML document.

• Modifying elements in the original XML document.

• Adding data to the elements in the original XML document.
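The first of these uses, whitespace normalization, can be sketched as a small filter that overrides the characters method (the class name WhitespaceNormalizer is our own invention, not from the original listing):

```java
import org.xml.sax.SAXException;
import org.xml.sax.helpers.XMLFilterImpl;

// A sketch of a whitespace-normalizing filter: each run of contiguous
// whitespace in character data is collapsed to a single space before the
// text is forwarded down the pipeline.
public class WhitespaceNormalizer extends XMLFilterImpl {

    @Override
    public void characters(char[] ch, int start, int length)
            throws SAXException {
        // Normalize the text reported by this event, then pass it on.
        char[] out = new String(ch, start, length)
                .replaceAll("\\s+", " ")
                .toCharArray();
        super.characters(out, 0, out.length);
    }
}
```

Note that a SAX parser is free to split character data across several characters events, so a run of whitespace that straddles two events would need buffering to collapse fully; this sketch handles the common single-event case.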

The following example illustrates how a filter can be used to modify an actual element, such that the element passed to the processing application differs from that defined in the original XML document.

A simple filter

The following listing shows a very simple SAX filter that changes all postal code elements into postcode elements. This effectively pre-processes the input document, modifying that one element while allowing all other elements to pass through unchanged.

A Simple XMLFilter class

To be a Java XML filter, a class must either implement the XMLFilter interface or extend the XMLFilterImpl class. In our case we extend the XMLFilterImpl class from the org.xml.sax.helpers package, which means that we only need to implement the methods that will actually do something (all other methods required by the XMLFilter interface are inherited from XMLFilterImpl). This keeps our code cleaner, and requires less work on our part.

We are now free to implement only the startElement and endElement methods (be careful to make sure that the method signatures are the same as those in the ContentHandler interface, otherwise you will be overloading the methods rather than overriding them, which will mean that your code will not be called).

Our startElement and endElement methods change the localName and the qName (qualified name) of the element from the American postal code to the British postcode whenever the postal code element is found. Otherwise they just pass the data through unaltered.

Be careful to call the inherited superclass methods once you have finished processing your data; this is what passes the event on and invokes the pipelining behaviour.
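A filter along these lines might look as follows. This is a sketch rather than the original listing: the class name PostcodeFilter and the element spelling postalcode are our assumptions.

```java
import org.xml.sax.Attributes;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.XMLFilterImpl;

// A sketch of the filter described above: it renames postalcode elements
// to postcode and lets everything else through unchanged.
public class PostcodeFilter extends XMLFilterImpl {

    @Override
    public void startElement(String uri, String localName,
                             String qName, Attributes atts) throws SAXException {
        if ("postalcode".equals(localName) || "postalcode".equals(qName)) {
            localName = "postcode";
            qName = "postcode";
        }
        // Forward the (possibly modified) event down the pipeline.
        super.startElement(uri, localName, qName, atts);
    }

    @Override
    public void endElement(String uri, String localName, String qName)
            throws SAXException {
        if ("postalcode".equals(localName) || "postalcode".equals(qName)) {
            localName = "postcode";
            qName = "postcode";
        }
        super.endElement(uri, localName, qName);
    }
}
```

If the calls to the super methods were omitted, the events would simply be swallowed and the element would never reach the downstream handler.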

SAX Processing Rules

You may wonder why we set both values. This has to do with the rules of SAX processing and the state of the following two standard SAX features:

http://xml.org/sax/features/namespaces and

http://xml.org/sax/features/namespace-prefixes.

Essentially, these rules say that:

1. the Namespace URI and local name are required when the namespaces property is true (the default), and are optional when the namespaces property is false (if one is specified, both must be);

2. the qualified name is required when the namespace-prefixes property is true, and is optional when the namespace-prefixes property is false (the default).

To handle these situations we set both parameters to the new element name. This is also why we test both parameters.
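Putting it all together, the filter can be wired between the parser and the application handler roughly as follows. The filter is inlined here so the example is self-contained; the class and element names are our assumptions, while the feature URLs are the standard SAX ones.

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

import javax.xml.parsers.SAXParserFactory;

import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.DefaultHandler;
import org.xml.sax.helpers.XMLFilterImpl;

// A sketch of a complete filter pipeline: parser -> filter -> handler.
public class FilterPipeline {

    // Inline version of the postalcode -> postcode filter sketched above.
    static class PostcodeFilter extends XMLFilterImpl {
        @Override
        public void startElement(String uri, String localName,
                                 String qName, Attributes atts) throws SAXException {
            if ("postalcode".equals(localName) || "postalcode".equals(qName)) {
                localName = "postcode";
                qName = "postcode";
            }
            super.startElement(uri, localName, qName, atts);
        }

        @Override
        public void endElement(String uri, String localName, String qName)
                throws SAXException {
            if ("postalcode".equals(localName) || "postalcode".equals(qName)) {
                localName = "postcode";
                qName = "postcode";
            }
            super.endElement(uri, localName, qName);
        }
    }

    // Parse the given XML and return the qualified names of the start
    // tags as the application sees them, i.e. after the filter has run.
    public static List<String> startTagNames(String xml) throws Exception {
        SAXParserFactory factory = SAXParserFactory.newInstance();
        factory.setNamespaceAware(true);  // namespaces feature = true
        // Ask the parser to report qualified names as well as local names.
        factory.setFeature("http://xml.org/sax/features/namespace-prefixes", true);
        XMLReader reader = factory.newSAXParser().getXMLReader();

        PostcodeFilter filter = new PostcodeFilter();
        filter.setParent(reader);  // events flow: reader -> filter -> handler

        final List<String> names = new ArrayList<>();
        filter.setContentHandler(new DefaultHandler() {
            @Override
            public void startElement(String uri, String localName,
                                     String qName, Attributes atts) {
                names.add(qName);
            }
        });

        filter.parse(new InputSource(new StringReader(xml)));
        return names;
    }
}
```

Note that the application's handler is registered on the filter, not on the reader, and parsing is started through the filter; the filter pulls events from its parent reader and pushes the transformed events on to the handler.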
