// Image Proxy for ActivityPub
// see NOTE https://github.com/Rob--W/cors-anywhere/issues/254#issuecomment-659037020
const corsAnywhere = require('cors-anywhere');
const express = require('express');
const apicache = require('apicache');
const sharp = require('sharp');
const path = require('path');
const expressHttpProxy = require('express-http-proxy');
// const xmpParser = import('./exifr/src/');
const exifr = require('exifr');
/* DOCUMENTATION
This is a WIP proxy with media manipulation support,
e.g.
http://localhost:8080/w/200/rotate/90/png/80_lossless/https://cdn.prod.www...
currently for images only;
chainable paths - chain up to 5 Resize/Image operations + 1 Output operation:
Resizing images;
-> {size} is the integer width/height in pixels or an explicit {width}x{height} string (which may change the aspect ratio)
/w/{size} or /width/{size}
/h/{size} or /height/{size}
/cover/{size}
  aspect ratio maintained, ensure the image covers both provided dimensions by cropping/clipping to fit.
/contain/{size}
  aspect ratio maintained, contain within both provided dimensions using "letterboxing".
/fill/{size}
  Ignore the aspect ratio of the input and stretch to both provided dimensions.
---
Image operations;
/rotate/{angle}
  angle:number; converted to a valid positive degree rotation, e.g. -450 will produce 270
/flip/
  Flip the image about the vertical Y axis. This always occurs after rotation, if any.
  The use of flip implies the removal of the EXIF Orientation tag, if any.
/flop/
  Flop the image about the horizontal X axis. This always occurs after rotation, if any.
  The use of flop implies the removal of the EXIF Orientation tag, if any.
/flatten/{background}/
  Merge the alpha transparency channel, if any, with a background, then remove the alpha channel.
  background:string hex color without # or rgb-string (optional), e.g. '000000' or 'rgb(255, 255, 255)'
/linear/{a?}/{b?}
  // TODO see https://github.com/libvips/libvips/issues/1741#issuecomment-663400937
  Levels adjustment of the ends.
  Apply the linear formula a * input + b to the image, see also gamma.
  a:number multiplier (optional, default 1.0)
  b:number offset (optional, default 0.0)
/clahe/{width_height_maxSlope?}
  This will, in general, enhance the clarity of the image by bringing out darker details.
  a string; values separated by underscore:
  width:number integer width of the region in pixels.
  height:number integer height of the region in pixels.
  maxSlope:number maximum value for the slope of the cumulative histogram;
    a value of 0 disables contrast limiting. Range 0-100 (inclusive) (optional, default 3)
/modulate/{brightness?_saturation?_hue?_lightness?}
  Transforms the image using brightness, saturation, hue rotation, and lightness.
  a string; values separated by underscore:
  brightness:number Brightness multiplier (optional)
  saturation:number Saturation multiplier (optional)
  hue:number Degrees for hue rotation (optional)
  lightness:number Lightness addend (optional)
/sharpen/{sigma?_flat?_jagged?}/
  When used without parameters, performs a fast, mild sharpen of the output image.
  a string; values separated by underscore:
  sigma:number the sigma of the Gaussian mask, where sigma = 1 + radius / 2 (optional)
  flat:number the level of sharpening to apply to "flat" areas (optional, default 1.0)
  jagged:number the level of sharpening to apply to "jagged" areas (optional, default 2.0)
/blur/{sigma}/
  When a sigma is provided, performs a slower, more accurate Gaussian blur.
  sigma:number a value between 0.3 and 1000 representing the sigma of the Gaussian mask; sigma = 1 + radius / 2 (optional)
/median/{size}/
  Apply a median filter. When used without parameters the default window is 3x3.
  size:number (optional, default 3)
/gamma/{gamma}/{gammaOut}/
  Apply a gamma correction by reducing the encoding (darken) pre-resize at a factor of 1/gamma then increasing the encoding
  (brighten) post-resize at a factor of gamma. This can improve the perceived brightness of a resized image in non-linear
  colour spaces. JPEG and WebP input images will not take advantage of the shrink-on-load performance optimisation.
  Supply a second argument to use a different output gamma value, otherwise the first value is used in both cases.
  gamma:number value between 1.0 and 3.0 (optional, default 2.2)
  gammaOut:number value between 1.0 and 3.0 (optional, default 2.2)
/negate/
/negate/noalpha
  Produce the "negative" of the image.
  noalpha: do not negate any alpha channel
/tint/{color}
  color:string hex color without # or rgb-string (optional), e.g. '000000' or 'rgb(255, 255, 255)'
/desaturate/
  Convert to greyscale; shortcut for /modulate/1_0
/grayscale/
/greyscale/
  Convert to 8-bit greyscale; 256 shades of grey.
/normalise/
/normalize/
  Enhance output image contrast by stretching its luminance to cover the full dynamic range.
/withMetadata/{orientation}/
  Include all metadata (EXIF, XMP, IPTC) from the input image in the output image.
  Will also convert to a web-friendly sRGB ICC profile unless a custom profile is provided.
  orientation:number value between 1 and 8, used to update the EXIF Orientation tag (optional)
---
in the end:
Output operations;
/jpg/{quality}/mozjpeg?_progressive?_optimiseScans?/
/jpeg/{quality}/mozjpeg?_progressive?_optimiseScans?/
/png/{quality}/progressive?/
/webp/{quality}/lossless?_nearLossless?_smartSubsample?_loop?/
/gif/{colors}/loop?/
/avif/{quality}/lossless?/
/heif/{quality}/lossless?/
/tif/{bitdepth}/
/tiff/{bitdepth}/
[
  For output formats a string; values separated by underscore: quality:number and named flags.
  quality is 1 - 100 and colors is 2 - 256; e.g. /png/80/ or /jpg/80/progressive/
  The named flag 'loop' can be extended by {iterations}_{delay} (integers for count and milliseconds),
  e.g. /gif/16/loop4_100
]
More formats depend on the libvips compilation:
the router will log all supported formats when starting.
See also https://sharp.pixelplumbing.com
*/
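A quick sketch of how these chainable segments compose into a request URL. The host and source image below are placeholder values, not part of this proxy:

```javascript
// Compose a chained proxy URL from [operation, argument] pairs.
// Base URL and source image are hypothetical placeholders.
const base = 'http://localhost:8080';
const source = 'https://example.com/photo.jpg';
const ops = [['w', 200], ['rotate', 90], ['png', '80_lossless']];
// Flatten the pairs into path segments and append the source URL last.
const proxyUrl = [base, ...ops.flat(), source].join('/');
// → "http://localhost:8080/w/200/rotate/90/png/80_lossless/https://example.com/photo.jpg"
```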
/* TODO
  make it a pluggable router
  sharp.metadata();
  sharp.composite();
  sharp.cache();
  sharp.concurrency();
  -more output options-
    png colours
    jp2 tileWidth, tileHeight (+ options.tile = true)
    tiff bitdepth? tileWidth, tileHeight (+ options.tile = true)
*/
/* PORT FOR PROXY */
const CORS_PROXY_PORT = 5000;
/* PORT FOR DEMO; can be overwritten by process.env.PORT */
const PUBLIC_APP_PORT = 8080;
/* IF OUTPUT TYPE INVALID; or (depending on libvips) if input can't be used for output */
const OUTPUT_FALLBACK = {
  image: 'png'
};
/* Resize parameters */
const whShortcuts = {
  thumb: 80,
  preview: 240,
  column: 600,
  page: 1280,
  hd: 1920
};
const containPosition = {
  top: 'top', right: 'right', bottom: 'bottom', left: 'left',
  righttop: 'right top', topright: 'right top',
  rightbottom: 'right bottom', bottomright: 'right bottom',
  leftbottom: 'left bottom', bottomleft: 'left bottom',
  lefttop: 'left top', topleft: 'left top',
  north: 'north', northeast: 'northeast',
  east: 'east', southeast: 'southeast',
  south: 'south', southwest: 'southwest',
  west: 'west', northwest: 'northwest',
  centre: 'centre', center: 'centre', middle: 'centre'
};
const coverPosition = {
  ...containPosition,
  entropy: 'entropy',
  attention: 'attention'
};
const sizeMethods = {
  width: 'width',
  w: 'width',
  height: 'height',
  h: 'height',
  cover: 'cover',
  contain: 'contain',
  fill: 'fill'
};
const sizeRegexStr = `${Object.keys(sizeMethods).join('|')}`;
/* Operations parameters */
const operations = {
  rotate: 'rotate', flip: 'flip', flop: 'flop', sharpen: 'sharpen', blur: 'blur',
  median: 'median', flatten: 'flatten', gamma: 'gamma', negate: 'negate', linear: 'linear',
  clahe: 'clahe', modulate: 'modulate', normalise: 'normalise', normalize: 'normalise',
  tint: 'tint', desaturate: 'desaturate', grayscale: 'grayscale', greyscale: 'greyscale',
  withmetadata: 'withMetadata'
};
const operationRegexStr = `${Object.keys(operations).join('|')}`;
/* Output parameters */
const outputSwitches = {
  jpeg: { mozjpeg: 1, progressive: 1, optimiseScans: 1 },
  png: { progressive: 1 },
  webp: { lossless: 1, nearLossless: 1, smartSubsample: 1 },
  jp2: { lossless: 1 },
  avif: { lossless: 1 },
  heif: { lossless: 1 }
};
// NOTE: The supported input and output formats slightly depend on how libvips is compiled;
const { format } = sharp;
const inputs = {};
const outputs = { jpg: 'jpeg', tif: 'tiff' };
for (const [key, o] of Object.entries(format)) {
  if (!!o.input && !!o.input.buffer) { inputs[key] = key }
  if (!!o.output && !!o.output.buffer) { outputs[key] = key }
}
const outputRegexStr = `${Object.keys(outputs).join('|')}`;
console.log('Supported image/* type for input:', inputs);
console.log('Supported output:', outputs);
/**
 * Construct the caching middleware
 */
function cacheMiddleware() {
  const cacheOptions = {
    statusCodes: { include: [200] },
    defaultDuration: 60000,
    appendKey: (req, res) => req.method
  };
  return apicache.options(cacheOptions).middleware();
}
// Create CORS Anywhere server
corsAnywhere.createServer({}).listen(CORS_PROXY_PORT, () => {
  console.log(
    `Internal CORS Anywhere server started at port ${CORS_PROXY_PORT}`
  );
});
// Create express Cache server
let app = express();
// Register cache middleware for GET and OPTIONS verbs
app.get('/*', cacheMiddleware());
app.options('/*', cacheMiddleware());
function isValidNr(value) {
  return typeof value === 'number' && !isNaN(value);
}
function toColor(value) {
  return value.indexOf('rgb') > -1 ? value : `#${value}`;
}
function parseSingleSize(size) {
  const value = whShortcuts.hasOwnProperty(size) ? whShortcuts[size] : parseInt(size, 10);
  return (isValidNr(value) && value > 0) ? value : 0;
}
function vNumber(_v, defaultV = void 0) {
  const v = parseFloat(_v);
  return !isValidNr(v) ? defaultV : v;
}
function vMinMax(_v, min, max, defaultV = void 0) {
  const v = parseFloat(_v);
  return !isValidNr(v) ? defaultV : Math.max(Math.min(max, v), min);
}
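The two guards above implement a parse-clamp-default pattern; a standalone sketch of the same idea (the name `clamp` is illustrative, not from this file):

```javascript
// Parse a string to a float, clamp into [min, max], or fall back on a default.
const clamp = (raw, min, max, fallback = undefined) => {
  const v = parseFloat(raw);
  return Number.isNaN(v) ? fallback : Math.max(min, Math.min(max, v));
};

console.log(clamp('1500', 0.3, 1000)); // clamped to the upper bound: 1000
console.log(clamp('abc', 1, 3, 2.2));  // unparseable, falls back: 2.2
```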
const argsFn = {
  size: (sizeMethod, sizeOrShortcut, position = 'centre') => {
    const o = { position };
    if (!sizeMethods.hasOwnProperty(sizeMethod)) { return o }
    if (sizeMethod === 'width' || sizeMethod === 'height') {
      o.fit = 'cover';
      o[sizeMethod] = parseSingleSize(sizeOrShortcut);
      return o;
    }
    o.fit = sizeMethod;
    let [w, h] = sizeOrShortcut.split('x');
    w = parseSingleSize(w);
    if (isValidNr(w) && w > 0) {
      o.width = w;
      const _h = parseSingleSize(h);
      o.height = (isValidNr(_h) && _h > 0) ? _h : w;
    }
    const pos = position.toLowerCase();
    if (!!pos) {
      if (o.fit === 'cover' && coverPosition.hasOwnProperty(pos)) {
        o.position = coverPosition[pos];
      } else if (o.fit === 'contain' && containPosition.hasOwnProperty(pos)) {
        o.position = containPosition[pos];
      }
    }
    return o;
  },
  operation: (op, value, optional) => {
    if (!operations.hasOwnProperty(op)) { return [] }
    const options = {};
    let operation = operations[op];
    // special shortcuts
    if (operation === 'desaturate') {
      operation = 'modulate';
      value = '1_0';
    }
    switch (operation) {
      case 'rotate':
        // angle; if provided, it is converted to a valid positive degree rotation, e.g. -450 will produce 270
        if (!value) { return [] }
        if (optional) { options.background = optional }
        return [vNumber(value), options];
      case 'blur':
        // a value between 0.3 and 1000 representing the sigma of the Gaussian mask, where sigma = 1 + radius / 2
        if (!value) { return [] }
        return [vMinMax(value, 0.3, 1000)];
      case 'sharpen': {
        // sigma_flat?_jagged?
        // sigma:number the sigma of the Gaussian mask, where sigma = 1 + radius / 2
        // flat:number the level of sharpening to apply to "flat" areas (optional, default 1.0)
        // jagged:number the level of sharpening to apply to "jagged" areas (optional, default 2.0)
        if (!value) { return [] }
        const [sigma, flat = 1, jagged = 2] = value.split('_');
        return [vMinMax(sigma, 0.3, 1000), vNumber(flat), vNumber(jagged)];
      }
      case 'linear':
        // a:number multiplier (optional, default 1.0)
        // b:number offset (optional, default 0.0)
        if (!value) { return [] }
        return [vNumber(value, 1), !!optional ? vNumber(optional, 0) : void 0];
      case 'tint':
        if (!value) { return [] }
        return [toColor(value)];
      case 'gamma':
        // gamma:number value between 1.0 and 3.0 (optional, default 2.2)
        // gammaOut:number value between 1.0 and 3.0 (optional, default 2.2)
        if (!value) { return [] }
        return [vMinMax(value, 1, 3), !!optional ? vMinMax(optional, 1, 3) : void 0];
      case 'median':
        // size x size (optional, default 3)
        if (!value) { return [] }
        return [vNumber(value)];
      case 'flatten':
        // background? hex color
        if (!value) { return [] }
        options.background = toColor(value);
        return [options];
      case 'negate':
        // noalpha: do not negate any alpha channel
        return (value || '').toLowerCase() === 'noalpha' ? [{ alpha: false }] : [];
      case 'clahe': {
        // width_height_maxSlope?
        // options.width:number integer width of the region in pixels.
        // options.height:number integer height of the region in pixels.
        // options.maxSlope:number maximum value for the slope of the cumulative histogram;
        //   a value of 0 disables contrast limiting. Range 0-100 (inclusive) (optional, default 3)
        if (!value) { return [] }
        const [width, height, maxSlope = 3] = value.split('_');
        return [{
          width: vNumber(width), height: vNumber(height), maxSlope: vMinMax(maxSlope, 0, 100)
        }];
      }
      case 'modulate': {
        // brightness_saturation_hue_lightness
        // options.brightness:number Brightness multiplier (optional)
        // options.saturation:number Saturation multiplier (optional)
        // options.hue:number Degrees for hue rotation (optional)
        // options.lightness:number Lightness addend (optional)
        if (!value) { return [] }
        const [brightness, saturation, hue, lightness] = value.split('_');
        return [{
          brightness: vNumber(brightness),
          saturation: vNumber(saturation),
          hue: vNumber(hue),
          lightness: vNumber(lightness)
        }];
      }
      case 'withMetadata':
        // orientation; between 1 and 8, used to update the EXIF Orientation tag.
        if (!value) { return [] }
        return [{ orientation: vMinMax(value, 1, 8) }];
      default:
        return [true];
    }
  },
  output: (format, qualityOrColorsStr, _booleans = '') => {
    if (!qualityOrColorsStr || !outputs.hasOwnProperty(format)) { return [] }
    format = outputs[format];
    const minQ = format === 'gif' ? 2 : 1;
    const maxQ = format === 'gif' ? 256 : 100;
    const quality = parseInt(qualityOrColorsStr, 10);
    const options = { quality: vMinMax(quality, minQ, maxQ, format === 'gif' ? 256 : 50) };
    if (format === 'tiff' && (_booleans === '1' || _booleans === '2' || _booleans === '4')) {
      options.bitdepth = parseInt(_booleans, 10);
    } else if ((format === 'gif' || format === 'webp') && _booleans.indexOf('loop') > -1) {
      const [_ = false, _loop, _delay] = _booleans.match(/loop(\d*)?_?(\d*)?/i) || [];
      if (!!_) {
        _booleans = _booleans.replace(_, '');
        const [loop = 0, delay] = [parseInt(_loop, 10), parseInt(_delay, 10)];
        if (isValidNr(loop)) { options.loop = loop }
        if (isValidNr(delay)) { options.delay = delay }
      }
    }
    const booleans = _booleans.split('_');
    for (let boolKey of booleans) {
      if (outputSwitches.hasOwnProperty(format) && outputSwitches[format].hasOwnProperty(boolKey)) {
        options[boolKey] = true;
      }
    }
    return [options];
  }
};
const toArray = (v) => typeof v === 'undefined' ? [] : (Array.isArray(v) ? v : [v]);
const dcTypes = {
  Collection: 'Collection', Dataset: 'Document', Event: 'Event', Image: 'Image',
  InteractiveResource: 'Object', MovingImage: 'Video', PhysicalObject: 'Object',
  Service: 'Service', Software: 'Application', Sound: 'Audio', StillImage: 'Image',
  Text: 'Note', Person: 'Person', Organization: 'Organization'
};
const metaProperties = {
  dc: {
    contributor: [], coverage: '', creator: [], date: [], description: '', format: '',
    identifier: '', language: [], publisher: [], relation: [], rights: '', source: '',
    subject: [], title: '', type: []
  },
  tiff: {
    // https://developer.adobe.com/xmp/docs/XMPNamespaces/tiff/
    // stored elsewhere in XMP: Artist, Copyright, ImageDescription
    BitsPerSample: [], Compression: '', DateTime: '', ImageLength: 0,
    ImageWidth: 0, Make: '', Model: '', Orientation: '', PhotometricInterpretation: '',
    PlanarConfiguration: '', PrimaryChromaticities: [], ReferenceBlackWhite: [],
    ResolutionUnit: '', SamplesPerPixel: 0, Software: '', TransferFunction: [],
    WhitePoint: [], XResolution: 0, YResolution: 0, YCbCrCoefficients: [],
    YCbCrPositioning: '', YCbCrSubSampling: []
  },
  exif: {
    ApertureValue: 0, BrightnessValue: 0, CFAPattern: {}, ColorSpace: '',
    CompressedBitsPerPixel: 0, Contrast: '', CustomRendered: '', DateTimeDigitized: '',
    DateTimeOriginal: '', DeviceSettingDescription: {}, DigitalZoomRatio: 0, ExifVersion: '',
    ExposureBiasValue: 0, ExposureIndex: 0, ExposureMode: '', ExposureProgram: '',
    ExposureTime: 0, FileSource: '', Flash: '', FlashEnergy: {}, FlashpixVersion: '',
    FNumber: 0, FocalLength: 0, FocalLengthIn35mmFilm: 0, FocalPlaneResolutionUnit: '',
    FocalPlaneXResolution: 0, FocalPlaneYResolution: 0, GainControl: '', ImageUniqueID: '',
    ISOSpeedRatings: [], LightSource: '', MaxApertureValue: 0, MeteringMode: '',
    OECF: {}, PixelXDimension: 0, PixelYDimension: 0, RelatedSoundFile: '', Saturation: '',
    SceneCaptureType: '', SceneType: '', SensingMethod: '', Sharpness: '', ShutterSpeedValue: 0,
    SpatialFrequencyResponse: {}, SpectralSensitivity: '', SubjectArea: [], SubjectDistance: 0,
    SubjectDistanceRange: '', SubjectLocation: [], WhiteBalance: '', GPSAltitude: 0, GPSAltitudeRef: '',
    GPSAreaInformation: '', GPSDestBearing: 0, GPSDestBearingRef: '', GPSDestDistance: 0,
    GPSDestDistanceRef: '', GPSDestLatitude: '', GPSDestLongitude: '', GPSDifferential: '',
    GPSDOP: 0, GPSImgDirection: 0, GPSImgDirectionRef: '', GPSLatitude: '', GPSLongitude: '',
    GPSMapDatum: '', GPSMeasureMode: '', GPSProcessingMethod: '', GPSSatellites: '', GPSSpeed: 0,
    GPSSpeedRef: '', GPSStatus: '', GPSTimeStamp: '', GPSTrack: 0, GPSTrackRef: '', GPSVersionID: '',
    ExposureCompensation: '', ISO: 0, FocalLengthIn35mmFormat: 0
  },
  photoshop: {
    ColorMode: 9, DocumentAncestors: [], History: '', ICCProfile: '', TextLayers: [],
    AuthorsPosition: '', CaptionWriter: '', Category: '', City: '', Country: '',
    Credit: '', DateCreated: '', Headline: '', Instructions: '', Source: '', State: '',
    SupplementalCategories: [], TransmissionReference: '', Urgency: 8
  },
  xmp: {
    CreateDate: '', CreatorTool: '', Identifier: [], Label: '', MetadataDate: '',
    ModifyDate: '', Rating: 5, BaseURL: '', Nickname: '', Thumbnails: []
  },
  xmpRights: {
    Certificate: '', Marked: true, Owner: [], UsageTerms: '', WebStatement: ''
  },
  xmpMM: {
    DerivedFrom: '', DocumentID: '', InstanceID: '', OriginalDocumentID: '',
    RenditionClass: '', RenditionParams: ''
  },
  Iptc4xmpCore: {
    Location: '', CountryCode: '', IntellectualGenre: '', SubjectCode: [], Scene: []
  }
};
const knownProperties = Object.entries(metaProperties).reduce((res, [vocabKey, o]) => {
  for (const [key, type] of Object.entries(o)) {
    res[key] = [vocabKey, type];
  }
  return res;
}, {});
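The reduce above flattens the per-vocabulary property map into a single lookup keyed by property name; the same shape in miniature (tiny illustrative input, not the real table):

```javascript
// Invert { vocab: { property: sampleValue } } into { property: [vocab, sampleValue] },
// mirroring how knownProperties is built above.
const vocabs = { dc: { title: '' }, exif: { ISO: 0 } };
const index = Object.entries(vocabs).reduce((res, [vocab, props]) => {
  for (const [key, sample] of Object.entries(props)) {
    res[key] = [vocab, sample];
  }
  return res;
}, {});
// index.title → ['dc', ''] and index.ISO → ['exif', 0]
```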
const getAStypes = (dcType = [], force = [], fallback = []) => {
  const mapped = toArray(dcType).reduce((a, t) => {
    return a.concat(dcTypes.hasOwnProperty(t) ? [dcTypes[t], `dc:${t}`] : [`dc:${t}`]);
  }, []).filter((v) => !!v);
  const types = toArray(force).concat(mapped);
  return !!types.length ? types : fallback;
};
const getDefault = (data, ldDefaultKey, iptcDefaultKey) => data[ldDefaultKey];
const toNameNote = (name) => ({ type: ['Note'], name });
const getLD = (data, href, mediaType, name = '') => {
  // see https://developer.adobe.com/xmp/docs/XMPNamespaces/
  const dataLink = {
    type: ['Link'],
    href,
    mediaType,
    name
  };
  if (typeof data !== 'object') { return { type: ['Image'], url: [dataLink] } }
  const baseData = {
    type: ['Image'], url: [dataLink], tag: [], result: [], context: []
  };
  // console.log(data);
  const metaLD = Object.entries(data).reduce((o, [k, v]) => {
    // old IPTC: fill missing LD values from the legacy keys
    [
      ['description', 'Caption'], ['rights', 'CopyrightNotice'], ['creator', 'Byline'],
      ['TransmissionReference', 'OriginalTransmissionReference'],
      ['CaptionWriter', 'Writer'], ['Instructions', 'SpecialInstructions']
    ].forEach((a, i) => {
      const [ldKey, iptcKey] = a;
      const missingLDdefault = (!data.hasOwnProperty(ldKey) || !data[ldKey]);
      if (missingLDdefault && data.hasOwnProperty(iptcKey)) {
        data[ldKey] = data[iptcKey];
      }
    });
    const iptcCore1 = { Keywords: 'tag', EditStatus: 'result', FixtureIdentifier: 'context' };
    if (iptcCore1.hasOwnProperty(k)) {
      if (typeof v === 'string') {
        o[iptcCore1[k]] = {
          type: k === 'Keywords' ? ['Note'] : ['Object', `redaktor:${k}`],
          name: k === 'Keywords' ? v.split(',').map((s) => s.trim()) : v
        };
      } else if (k === 'Keywords' && Array.isArray(v)) {
        o[iptcCore1[k]] = v.map(toNameNote);
      }
      return o;
    }
    if (k === 'ExifImageWidth' || k === 'ExifImageHeight') {
      o[k] = v;
    }
    if (k === 'latitude' && data.hasOwnProperty('longitude')) {
      if (!o.hasOwnProperty('location')) { o.location = {} }
      o.location = { type: ['Place'], latitude: v, longitude: data.longitude, ...o.location };
      if (!o.location.hasOwnProperty('altitude') && data.hasOwnProperty('altitude')) {
        o.location.altitude = data.altitude;
      }
      if (!o.location.hasOwnProperty('radius') && data.hasOwnProperty('radius')) {
        o.location.radius = data.radius;
      }
    }
    if (!knownProperties.hasOwnProperty(k)) {
      return o;
    }
    const [vocab, sample] = knownProperties[k];
    const key = `${vocab}:${k}`;
    if (typeof v === 'object' && !Array.isArray(v) &&
      v.hasOwnProperty('lang') && v.lang === 'x-default' &&
      v.hasOwnProperty('value')
    ) {
      o[key] = Array.isArray(sample) ? [v.value] : v.value;
      return o;
    }
    if (vocab === 'exif' && typeof sample === 'string' && typeof v === 'number') {
      o[key] = v;
    } else if (Array.isArray(sample)) {
      o[key] = toArray(v);
      return o;
    } else if (Array.isArray(v)) {
      v = v[0];
    }
    if (typeof sample === 'number') {
      const int = vocab === 'exif' ? parseFloat(v) : parseInt(v, 10);
      if (!!int && !isNaN(int) && int > (k === 'Rating' ? -1 : 0) && int <= sample) {
        o[key] = int;
      } else if (sample === 0 && !!int && !isNaN(int)) {
        o[key] = int;
      }
      return o;
    }
    if (typeof sample === typeof v || (v instanceof Date && typeof v.getMonth === 'function')) {
      o[key] = v;
    }
    return o;
  }, baseData);
  const hasAnyProps = (...keys) => !!keys.map((prop) => metaLD.hasOwnProperty(prop)).filter((b) => !!b).length;
  const multilang = (k, v, result = {}) => {
    if (!v) { return [] }
    if (typeof v === 'string') {
      result[k] = v;
    } else {
      result[`${k}Map`] = toArray(v).reduce((o, langO) => {
        if (langO.hasOwnProperty('lang') && langO.hasOwnProperty('value')) {
          o[langO.lang] = langO.value;
        }
        return o;
      }, {});
    }
    return [result];
  };
  if (hasAnyProps('Iptc4xmpCore:Location')) {
    if (!hasAnyProps('location')) { metaLD.location = { type: ['Place'], name: '', summary: '' } }
    if (!metaLD.location.hasOwnProperty('name')) {
      metaLD.location.name = metaLD['Iptc4xmpCore:Location'];
    } else if (!metaLD.location.hasOwnProperty('summary')) {
      metaLD.location.summary = metaLD['Iptc4xmpCore:Location'];
    }
  }
  if (hasAnyProps('photoshop:City', 'photoshop:State', 'photoshop:Country', 'Iptc4xmpCore:CountryCode')) {
    if (!hasAnyProps('location')) { metaLD.location = { type: ['Place'], name: '', summary: '' } }
    const getLocationFromSubproperties = () => {
      const [City = '', State = '', Country = '', CCode = ''] = [
        metaLD['photoshop:City'], metaLD['photoshop:State'],
        metaLD['photoshop:Country'], metaLD['Iptc4xmpCore:CountryCode']
      ];
      return `in ${City}${!!City ? ', ' : ' '}${State}${!!State ? ', ' : ' '}` +
        `${Country}${!!CCode ? ' (' : ''}${CCode}${!!CCode ? ')' : ''}`;
    };
    if (!metaLD.location.hasOwnProperty('name') || !metaLD.location.name.length) {
      metaLD.location.name = getLocationFromSubproperties();
    } else if (!metaLD.location.hasOwnProperty('summary') || !metaLD.location.summary.length) {
      metaLD.location.summary = getLocationFromSubproperties();
    }
  }
  if (hasAnyProps('location')) {
    if (!metaLD.location.hasOwnProperty('altitude') &&
      metaLD.hasOwnProperty('exif:GPSAltitude')) {
      if (typeof metaLD['exif:GPSAltitude'] === 'string') {
        metaLD['exif:GPSAltitude'] = parseInt(metaLD['exif:GPSAltitude'], 10);
      }
      if (isValidNr(metaLD['exif:GPSAltitude'])) {
        metaLD.location.altitude = metaLD.hasOwnProperty('exif:GPSAltitudeRef') &&
          (metaLD['exif:GPSAltitudeRef'] === 0 || metaLD['exif:GPSAltitudeRef'] === 'Below sea level') ?
          (0 - metaLD['exif:GPSAltitude']) : metaLD['exif:GPSAltitude'];
      }
    }
  }
  const contextKeys = ['photoshop:TransmissionReference', 'Iptc4xmpCore:Scene'];
  const tagKeys = ['Iptc4xmpCore:SubjectCode', 'photoshop:Category', 'photoshop:SupplementalCategories'];
  if (hasAnyProps(...contextKeys)) {
    contextKeys.forEach((ldKey) => {
      if (!hasAnyProps(ldKey)) { return }
      const [ldV = ''] = [metaLD[ldKey]];
      if (!!ldV) {
        metaLD.context = (toArray(metaLD.context) || []).concat(toArray(ldV).map((name) =>
          ({ type: ['Object', `redaktor:${ldKey.split(':')[1]}`], name })));
      }
    });
  }
  if (hasAnyProps(...tagKeys)) {
    tagKeys.forEach((ldKey) => {
      if (!hasAnyProps(ldKey)) { return }
      const [ldV = ''] = [metaLD[ldKey]];
      if (!!ldV) {
        metaLD.tag = (toArray(metaLD.tag) || []).concat(toArray(ldV).map(toNameNote));
      }
    });
  }
  if (hasAnyProps('dc:creator', 'photoshop:Credit', 'photoshop:Source', 'dc:rights')) {
    if (!hasAnyProps('attributedTo')) { metaLD.attributedTo = { type: ['Group'] } }
    metaLD.attributedTo = { type: ['Group'], name: [], summary: [], ...metaLD.attributedTo };
    const [
      creator = [], Credit = '', Source = '', right = '', usage = '', webStatement = '',
      headline = '', genre = '', subject = '', description = ''
    ] = [
      toArray(metaLD['dc:creator']), metaLD['photoshop:Credit'], metaLD['photoshop:Source'],
      metaLD['dc:rights'], metaLD['xmpRights:UsageTerms'], metaLD['xmpRights:WebStatement'],
      metaLD['photoshop:Headline'], metaLD['Iptc4xmpCore:IntellectualGenre'],
      metaLD['dc:subject'], metaLD['dc:description']
    ];
    if (!!creator.length) {
      metaLD.attributedTo.name = (toArray(metaLD.attributedTo.name) || []).concat(toArray(creator));
    }
    if (!metaLD.attributedTo.hasOwnProperty('name') && (!!Credit || !!Source)) {
      metaLD.attributedTo.name = `${Credit}${!!Credit && !!Source ? ' / ' : ''}${Source}`;
    } else if (!!Credit || !!Source) {
      metaLD.attributedTo.summary = (toArray(metaLD.attributedTo.summary) || []).concat(
        toArray(`${Credit}${!!Credit && !!Source ? ' / ' : ''}${Source}`)
      );
    }
    [right, usage].forEach((v) => {
      if (!!v) {
        const values = (multilang('content', v) || []).map((r) => r.content);
        metaLD.attributedTo.content = (toArray(metaLD.attributedTo.content) || []).concat(values);
      }
    });
    [headline, genre].forEach((v) => {
      if (!!v) {
        const values = (multilang('name', v) || []).map((r) => r.name);
        metaLD.name = (toArray(metaLD.name) || []).concat(values).filter((v) => !!v);
      }
    });
    if (!!subject) {
      const values = (multilang('summary', subject) || []).map((r) => r.summary);
      metaLD.subject = (toArray(metaLD.subject) || []).concat(values).filter((v) => !!v);
    }
    if (!!description) {
      const values = (multilang('content', description) || []).map((r) => r.content);
      if (!metaLD.subject) {
        metaLD.subject = values;
      } else {
        metaLD.content = (toArray(metaLD.content) || []).concat(values).filter((v) => !!v);
      }
    }
  }
  // TODO name = Make Model
  const metaResult = {};
  const instrument = { type: ['Object'] };
  for (let [k, v] of Object.entries(metaLD)) {
    const [vocab, key] = k.split(':');
    if (vocab === 'exif' || vocab === 'tiff') {
      instrument[k] = v;
    } else {
      metaResult[k] = v;
    }
  }
  if (Object.keys(instrument).length > 1) { metaResult.instrument = instrument }
  return metaResult;
  /*
  data = {type: ['Image'], tag: [], result: [], context: [], ...data};
  type
  ---
  https://developer.adobe.com/xmp/docs/XMPNamespaces/XMPDataTypes/ContactInfo/
  ---
  url
  published | updated
  if (v instanceof Date && typeof v.getMonth === 'function' && typeof sample === 'string') {
    o[key] = v.toISOString()
  }
  icon
  preview
  generator
  attachment
  duration
  current | first | last | items | next | prev | partOf | endTime | startTime | startIndex |
  totalItems | relationship | describes | formerType | deleted
  'photoshop:Urgency': 1,
  'xmp:Rating': 2,
  'photoshop:Instructions': 'Anweisungen', // German sample value: "instructions"
  'photoshop:TransmissionReference': 'jobjennung', // German sample value: "job reference"
  'photoshop:CaptionWriter': 'Verfasser d. Beschr.', // German sample value: "caption writer"
  */
};
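getLD's multilang helper folds XMP language alternatives into an ActivityStreams-style `*Map` property; the core move in isolation (the sample values are made up):

```javascript
// Fold [{ lang, value }] alternatives into an ActivityStreams-style nameMap,
// mirroring the multilang helper inside getLD above.
const toLangMap = (key, values) => ({
  [`${key}Map`]: values.reduce((o, { lang, value }) => ({ ...o, [lang]: value }), {})
});

const result = toLangMap('name', [
  { lang: 'en', value: 'Lighthouse' },
  { lang: 'de', value: 'Leuchtturm' }
]);
// result → { nameMap: { en: 'Lighthouse', de: 'Leuchtturm' } }
```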
const getMeta = async (data, reqUrl, mediaType, linkName = '', plainJSON = false) => {
  const parsed = await exifr.parse(data, true) || {};
  if (parsed.hasOwnProperty('makerNote')) { delete parsed.makerNote }
  for (let [key, value] of Object.entries(parsed)) {
    if (value instanceof Uint8Array) {
      const strValue = new TextDecoder().decode(value).replace(/^ASCII/, '').replace(/\0/g, '').trim();
      if (!!strValue) {
        parsed[key] = strValue;
      } else {
        delete parsed[key];
      }
    }
  }
  if (!!plainJSON) { return parsed }
  const parsedLD = getLD(parsed, reqUrl, mediaType, linkName);
  if (!parsedLD.name) { parsedLD.name = [path.basename(linkName)] }
  return parsedLD;
};
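getMeta above normalises EXIF byte-array values before handing them to getLD; the decode-and-strip step can be checked on its own (the byte sequence is fabricated sample data):

```javascript
// Decode an EXIF Uint8Array value, dropping the "ASCII" prefix and NUL padding,
// as the loop in getMeta does. These bytes spell "ASCIIhi" plus two NULs.
const bytes = new Uint8Array([65, 83, 67, 73, 73, 104, 105, 0, 0]);
const text = new TextDecoder().decode(bytes)
  .replace(/^ASCII/, '')
  .replace(/\0/g, '')
  .trim();
// text → "hi"
```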
// Proxy to CORS server when request misses cache
/*
// method: width/w height/h cover contain fill
// options.withoutEnlargement
*/
const methodsRegexStr = `${sizeRegexStr}|${operationRegexStr}|${outputRegexStr}`;
const methodsRegex = new RegExp(methodsRegexStr);
const ROUTE = [0, 1, 2, 3, 4, 5, 6].map((i) => !i ? `/:op0(${methodsRegexStr})/:a0?/:b0?` :
  `/:op${i}?/:a${i}?/:b${i}?`).join('') + '/';
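The ROUTE template built above chains one required matcher plus six optional op/arg triples; its shape can be verified with a stand-in alternation (the real one is sizes|operations|outputs):

```javascript
// Rebuild the route template with a stand-in methods pattern to show its shape.
const pattern = 'w|h|rotate'; // stand-in for the real alternation of method names
const route = [0, 1, 2, 3, 4, 5, 6]
  .map((i) => !i ? `/:op0(${pattern})/:a0?/:b0?` : `/:op${i}?/:a${i}?/:b${i}?`)
  .join('') + '/';
// route starts with "/:op0(w|h|rotate)/:a0?/:b0?" and ends with "/:op6?/:a6?/:b6?/"
```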
app.use(ROUTE, expressHttpProxy(`localhost:${CORS_PROXY_PORT}`, {
  preserveHostHdr: true,
  userResDecorator: async (proxyRes, proxyResData, req, res) => {
    const fallback = proxyResData;
    try {
      const reqUrl = `${req.protocol || 'https'}://${req.get('host')}${req.originalUrl}`;
      const proxyUrl = proxyRes.hasOwnProperty('socket') && proxyRes.socket.hasOwnProperty('_httpMessage') ?
        proxyRes.socket._httpMessage.path || reqUrl : reqUrl;
      const mediaType = (!res.get('content-type') ? 'image/png' : res.get('content-type')).toLowerCase();
      const [mainType, imageType] = mediaType.split('/');
      if (mainType !== 'image') { return fallback }
      const outputFallback = ['output', outputs.hasOwnProperty(imageType) ? imageType : OUTPUT_FALLBACK.image];
      let newData = proxyResData;
      let hasMetadata = false;
      // Collect /:op/:a/:b triples into groups: a segment matching a known method
      // starts a new group, non-matching neighbours become its arguments.
      const paramGroups = Object.values(req.params).reduce((a, _v, i, pa) => {
        if (!_v) { return a }
        const v = _v.toLowerCase();
        if (i < pa.length - 3) {
          if (methodsRegex.test(v)) {
            const _a = [v];
            if (!methodsRegex.test(pa[i+1])) { _a.push(pa[i+1]) }
            if (!methodsRegex.test(pa[i+2])) { _a.push(pa[i+2]) }
            return a.concat([_a])
          }
        } else {
          return a.concat([[v, pa[i+1], pa[i+2]]]);
        }
        return a
      }, []).map((a) => {
        const [v, ...args] = a.filter((s) => !!s);
        if (v === 'withmetadata') { hasMetadata = true }
        const [type, method] = sizeMethods.hasOwnProperty(v) ? ['size', sizeMethods[v]] : (
          operations.hasOwnProperty(v) ? ['operation', operations[v]] : (
            outputs.hasOwnProperty(v) ? ['output', outputs[v]] : []
          )
        );
        return [type, method, ...args]
      }).filter((a) => !!a.length && !!a[0]);
      // Ensure the chain always ends with an output format
      if (!!paramGroups.length && paramGroups[paramGroups.length-1][0] !== 'output') {
        paramGroups.push(outputFallback)
      }
      const acceptsJSON = !!req.accepts('application/json') || !!req.accepts('json');
      const isPlainJSONmeta = !req.accepts('application/ld+json') && acceptsJSON;
      const meta = (!hasMetadata || (!req.accepts('application/ld+json') && !acceptsJSON)) ? {} :
        await getMeta(newData, reqUrl, mediaType, proxyUrl, isPlainJSONmeta);
      // sharp() returns a pipeline synchronously (no await needed); abort processing after 8s
      newData = sharp(newData).timeout({ seconds: 8 });
      for (let a of paramGroups) {
        const [type, methodOrOutput, ...args] = a;
        const options = argsFn[type](methodOrOutput, ...args);
        // console.log(type, methodOrOutput, options, (methodOrOutput in newData));
        if (type === 'size' && (!!options.width || !!options.height)) {
          newData = newData.resize(options)
        }
        if (type === 'operation' && (methodOrOutput in newData)) {
          newData = newData[methodOrOutput](...options)
        }
        if (type === 'output') {
          newData = newData.toFormat(methodOrOutput, options)
        }
      }
      const output = await newData.toBuffer();
      // const withMeta = await sharp(output).stats(); // dominant color …
      const withMeta = await sharp(output).metadata();
      if (meta.url && !!meta.url.length) {
        if (withMeta.width) { meta.url[0].width = withMeta.width }
        if (withMeta.height) { meta.url[0].height = withMeta.height }
      }
      // console.log(meta, JSON.stringify(meta.context));
      // TODO - always cache output
      if (!req.accepts('html') && (acceptsJSON || req.accepts('application/ld+json'))) {
        // meta is already formatted accordingly
        return JSON.stringify(meta);
      }
      // if (!req.accepts('html')) ---> 404
      return output
    } catch (e) {
      console.log('!error ', e);
      return fallback
    }
  }
}));
const APP_PORT = process.env.PORT || PUBLIC_APP_PORT;
app.listen(APP_PORT, () => {
  console.log(`External CORS cache server started at port ${APP_PORT}`);
}); |
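The chained path segments parse into `[type, method, ...args]` tuples (see `paramGroups` in the response decorator above). A simplified standalone sketch of that grouping, with assumed miniature method maps in place of the real `sizeMethods`/`operations`/`outputs`:

```javascript
// Assumed simplified method maps; the real ones are defined earlier in the file.
const demoSizeMap = { w: 'width' };
const demoOpsMap = { rotate: 'rotate' };
const demoOutMap = { png: 'png' };
// Segments from a request like /w/200/rotate/90/png/80/<url>
const demoSegs = ['w', '200', 'rotate', '90', 'png', '80'];
const demoGroups = [];
for (let i = 0; i < demoSegs.length; i += 2) {
  const v = demoSegs[i];
  const [type, method] = demoSizeMap[v] ? ['size', demoSizeMap[v]]
    : demoOpsMap[v] ? ['operation', demoOpsMap[v]]
    : ['output', demoOutMap[v]];
  demoGroups.push([type, method, demoSegs[i + 1]]);
}
// demoGroups → [['size','width','200'], ['operation','rotate','90'], ['output','png','80']]
```

The real reducer is more general (an operation may take zero, one, or two arguments), but the resulting tuple shape is the same.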
// might use any namespace prefix from the default redaktor @context
exports.as = 'https://www.w3.org/ns/activitystreams';
exports.security = 'https://w3id.org/security/v1';
exports.vocab = { '@vocab': exports.as };
exports.wellKnownVocab = {
as: exports.as,
bibo: 'http://purl.org/ontology/bibo/',
dc: 'http://purl.org/dc/elements/1.1/',
dcat: 'http://www.w3.org/ns/dcat#',
dcterms: 'http://purl.org/dc/terms/',
dctype: 'http://purl.org/dc/dcmitype/',
exif: 'http://ns.adobe.com/exif/1.0/',
eli: 'http://data.europa.eu/eli/ontology#',
foaf: 'http://xmlns.com/foaf/0.1/',
ical: 'http://www.w3.org/2002/12/cal/ical#',
Iptc4xmpCore: 'http://iptc.org/std/Iptc4xmpCore/1.0/xmlns/',
Iptc4xmpExt: 'http://iptc.org/std/Iptc4xmpExt/2008-02-29/',
ldp: 'http://www.w3.org/ns/ldp#',
og: 'http://ogp.me/ns#',
org: 'http://www.w3.org/ns/org#',
owl: 'http://www.w3.org/2002/07/owl#',
photoshop: 'http://ns.adobe.com/photoshop/1.0/',
rdf: 'http://www.w3.org/1999/02/22-rdf-syntax-ns#',
rdfa: 'http://www.w3.org/ns/rdfa#',
rdfs: 'http://www.w3.org/2000/01/rdf-schema#',
redaktor: 'https://purl.org/redaktor/namespace',
schema: 'http://schema.org/',
skos: 'http://www.w3.org/2004/02/skos/core#',
snomed: 'http://purl.bioontology.org/ontology/SNOMEDCT/',
tiff: 'http://ns.adobe.com/tiff/1.0/',
vcard: 'http://www.w3.org/2006/vcard/ns#',
vf: 'https://w3id.org/valueflows#',
void: 'http://rdfs.org/ns/void#',
xml: 'http://www.w3.org/XML/1998/namespace',
xmp: 'http://ns.adobe.com/xap/1.0/',
xmpDM: 'http://ns.adobe.com/xmp/1.0/DynamicMedia/',
xmpMM: 'http://ns.adobe.com/xap/1.0/mm/',
xmpRights: 'http://ns.adobe.com/xap/1.0/rights/',
xsd: 'http://www.w3.org/2001/XMLSchema#'
}
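With this prefix map, a compacted term like `xmp:Rating` expands to a full IRI by concatenating the namespace and the local name. A minimal sketch (the `demoExpand` helper and `demoVocab` subset are illustrative, not part of the gist):

```javascript
// Subset of the wellKnownVocab map above, used only for this demonstration
const demoVocab = { xmp: 'http://ns.adobe.com/xap/1.0/' };
const demoExpand = (term) => {
  const [prefix, ...rest] = term.split(':');
  // Only expand when there is a local part and the prefix is known;
  // absolute IRIs (unknown "prefix" like http) pass through unchanged.
  return rest.length && demoVocab[prefix] ? demoVocab[prefix] + rest.join(':') : term;
};
// demoExpand('xmp:Rating') === 'http://ns.adobe.com/xap/1.0/Rating'
```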
Sample images, e.g. from https://github.com/ianare/exif-samples
Image Output
http://localhost:8080/w/200/png/80/https://raw.githubusercontent.com/ianare/exif-samples/master/jpg/gps/DSCN0010.jpg
in the browser, or
Plain JSON Output
curl -i -H "Accept: application/json" http://localhost:8080/w/200/png/80/withMetadata/https://raw.githubusercontent.com/ianare/exif-samples/master/jpg/gps/DSCN0010.jpg?x=a
JSON LD (as) Output
curl -i -H "Accept: application/ld+json" http://localhost:8080/w/200/png/80/withMetadata/https://raw.githubusercontent.com/ianare/exif-samples/master/jpg/gps/DSCN0010.jpg?x=b
`
-->