JPX is an extension of the JPEG2000 image file format (jp2): it can be seen as a container that describes the jp2 image and its associated metadata.
The objectives of this format can be summarised as follows:
- Specify extended decoding processes for converting compressed image data to reconstructed image data.
- Specify an extended code-stream syntax containing information for interpreting the compressed image data.
- Specify an extended file format.
- Specify a container to store image metadata.
- Define a standard set of image metadata.
- Provide guidance on extended encoding processes for converting source image data to compressed image data.
- Provide guidance on how to implement these processes in practice.
The interesting part for our goals is the definition of a standard set of metadata for image description and the specification of a container able to store it.
JPX is largely based on the DIG35 standard, from which it derives an XML language able to represent a complete set of metadata related to the image.
The metadata elements specify information such as how the image was created, captured or digitised, how it has been edited since it was originally created,
the associated intellectual property rights, and the content of the image, such as the names of the people and places it depicts.
These elements can be grouped into four sections, plus a common section for the definition of shared types:
- Image Creation Metadata: describes the source from which the image was created.
For example, camera and lens information and capture conditions are useful technical information for professional and serious amateur photographers
as well as for advanced imaging applications.
- Content Description Metadata: describes the who, what, when and where aspects
of the image. This metadata often takes the form of extensive words, phrases, or sentences describing a particular event or location that the image illustrates.
Typically, it consists of text entered by the user, either when the images are taken or scanned, or later during manipulation or use of the images.
- History Metadata: provides partial information about how the image reached its present state.
For example, the history may include certain processing steps that have been applied to the image. Another example would be the image creation
events, including digital capture, exposure of negative or reversal films, creation of prints, or reflective scans of prints.
All of this metadata is important for some applications. To permit flexibility in constructing the image history metadata,
two alternate representations of the history are permitted. In the first, the history metadata is embedded in the image metadata.
In the second, the previous versions of the image are included in the history metadata as URL/URI pointers to the location of the actual history.
The history metadata for a composite image (i.e., one created from two or more previous images) may also be represented through a hierarchical metadata structure.
While the specification does not define how, or how much of, the processing is recorded, it does enable logging of certain
processing steps applied to an image as hints for future use.
- Intellectual Property Rights (IPR) Metadata: defines metadata that either protects the rights
of the owner of the image or provides the information needed to request permission to use it. It is important for developers and users to understand
the implications of intellectual property and copyright information on digital images, so that the rights of the owner of the image data are properly protected.
- Fundamental Metadata Types and Elements: define common data types that may be used within each metadata group.
These include an address type and a person type, which are collections of other primitive data types. The fundamental metadata elements define elements
that are commonly referenced within the other metadata groups, including a definition for language specification and a timestamp.
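The two permitted history representations, and the hierarchical structure used for composite images, can be sketched as data structures. The class and field names below are illustrative, not identifiers taken from the JPX specification:

```python
from dataclasses import dataclass, field
from typing import List, Union

# Illustrative sketch only: these names are not from the JPX
# specification; they model the two permitted history representations.

@dataclass
class EmbeddedHistory:
    """History metadata embedded directly in the image metadata."""
    processing_steps: List[str]

@dataclass
class LinkedHistory:
    """Previous image versions referenced by URL/URI pointers."""
    previous_versions: List[str]

@dataclass
class ImageHistory:
    """History of one image; a composite image nests the histories of
    its source images, giving the hierarchical structure described above."""
    history: Union[EmbeddedHistory, LinkedHistory]
    sources: List["ImageHistory"] = field(default_factory=list)

# A composite built from a scanned print and a digitally captured image.
scan = ImageHistory(EmbeddedHistory(["reflective scan of print"]))
capture = ImageHistory(LinkedHistory(["http://example.org/previous.jpx"]))
composite = ImageHistory(EmbeddedHistory(["composite", "crop"]),
                         sources=[scan, capture])
print(len(composite.sources))
```

The nesting in `sources` is what gives the hierarchical representation for composites: each source image carries its own history, embedded or linked.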
JPX is composed of several boxes, among which two are particularly interesting:
- the MPEG-7 Binary box;
- the XML box.
The first box contains metadata in MPEG-7 binary format (BiM), as described in the MPEG-7 section, while the second box, already defined in
the jp2 file format specification, allows the creation of metadata description schemes able to represent the considered MM data.
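Boxes in the jp2 family share a simple layout: a 4-byte big-endian length, a 4-byte type code, then the payload (with extended lengths signalled by a length of 1, and a length of 0 meaning "to end of file"). The following minimal Python sketch enumerates top-level boxes; the sample data uses the real `jP  ` signature and `xml ` box types, though real files contain many more boxes:

```python
import io
import struct

def iter_boxes(stream):
    """Yield (box_type, payload) for each top-level box in a jp2/JPX
    stream: 4-byte big-endian length, 4-byte type code, then payload."""
    while True:
        header = stream.read(8)
        if len(header) < 8:
            return
        length, box_type = struct.unpack(">I4s", header)
        if length == 1:
            # An extended 8-byte length follows the type field.
            length = struct.unpack(">Q", stream.read(8))[0]
            payload = stream.read(length - 16)
        elif length == 0:
            # The box extends to the end of the file.
            payload = stream.read()
        else:
            payload = stream.read(length - 8)
        yield box_type.decode("ascii", "replace"), payload

# A minimal stream holding a signature box followed by an XML box.
xml_payload = b"<metadata>example</metadata>"
data = (
    struct.pack(">I4s", 12, b"jP  ") + b"\r\n\x87\n"          # signature box
    + struct.pack(">I4s", 8 + len(xml_payload), b"xml ") + xml_payload
)
for box_type, payload in iter_boxes(io.BytesIO(data)):
    print(box_type, len(payload))   # each box type and its payload size
```

A real reader would dispatch on the type code, handing `xml ` payloads to an XML parser and `mp7b`-style binary payloads to a BiM decoder.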
An interesting initiative is the mapping between the Dublin Core element set and the XML metadata used in JPX files.
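Such a mapping can be pictured as a lookup from Dublin Core elements to the JPX metadata sections described above. The Dublin Core element names below are real, but which section each one maps to is an assumption made for this sketch, not the actual mapping defined by the initiative:

```python
# Illustrative only: the assignment of each Dublin Core element to a
# JPX section is an assumption for this sketch, not the real mapping.
DC_TO_JPX_SECTION = {
    "dc:creator":     "Image Creation Metadata",
    "dc:date":        "Image Creation Metadata",
    "dc:description": "Content Description Metadata",
    "dc:subject":     "Content Description Metadata",
    "dc:rights":      "Intellectual Property Rights Metadata",
}

def jpx_section_for(dc_element):
    """Return the JPX metadata section a Dublin Core element would map
    to under this sketch, or None if the sketch does not cover it."""
    return DC_TO_JPX_SECTION.get(dc_element)

print(jpx_section_for("dc:rights"))
```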