Andy Hawkes (azhawkes)

azhawkes / SwtImageSprite.java
Created December 20, 2012 19:01
Simple Java/SWT class for image sprites. Quickly slices a larger image into smaller ones while preserving alpha transparency. Most of the existing examples out there don't handle the alpha channel properly.
package com.andyhawkes.gists;

import org.eclipse.swt.graphics.Image;
import org.eclipse.swt.graphics.ImageData;
import org.eclipse.swt.graphics.Rectangle;
import org.eclipse.swt.widgets.Display;

/**
 * Simple class that demonstrates how to slice up regions of a sprite image in SWT, while
 * preserving alpha transparency. There are shorter ways to do this if you don't care about
 * alpha transparency.
 */
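// NOTE: the gist preview cuts off at the Javadoc above. The class body below is a hedged
// reconstruction, not the gist's verbatim code: it assumes the slice is done by copying
// pixel and alpha values one by one through ImageData, which is the standard way to keep
// transparency intact in SWT.
public class SwtImageSprite {
    private final ImageData data;

    public SwtImageSprite(Image sprite) {
        this.data = sprite.getImageData();
    }

    // Copies the region's pixels AND its alpha values into a fresh ImageData; copying only
    // the pixels is the shortcut that loses transparency.
    public Image slice(Display display, Rectangle region) {
        ImageData slice = new ImageData(region.width, region.height, data.depth, data.palette);
        for (int y = 0; y < region.height; y++) {
            for (int x = 0; x < region.width; x++) {
                slice.setPixel(x, y, data.getPixel(region.x + x, region.y + y));
                slice.setAlpha(x, y, data.getAlpha(region.x + x, region.y + y));
            }
        }
        return new Image(display, slice);
    }
}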
azhawkes / WebConfig.groovy
Created August 19, 2015 17:26
Spring Boot (RestController) - support for application/octet-stream using InputStream
import org.springframework.context.annotation.Configuration
import org.springframework.http.HttpInputMessage
import org.springframework.http.HttpOutputMessage
import org.springframework.http.MediaType
import org.springframework.http.converter.AbstractHttpMessageConverter
import org.springframework.http.converter.HttpMessageConverter
import org.springframework.util.StreamUtils
import org.springframework.web.servlet.config.annotation.WebMvcConfigurationSupport

/**
 * Adds support for application/octet-stream through a RestController using streams.
 */
@Configuration
class WebConfig extends WebMvcConfigurationSupport {
    @Override
    protected void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
        converters.add(new AbstractHttpMessageConverter<InputStream>(MediaType.APPLICATION_OCTET_STREAM) {
            protected boolean supports(Class<?> clazz) {
                return InputStream.isAssignableFrom(clazz)
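            }

            // NOTE: the gist preview is truncated above; the two overrides below are a hedged
            // reconstruction, not the gist's verbatim code. Reading hands back the raw request
            // body; writing pipes the InputStream to the response. Spring's StreamUtils.copy is
            // one reasonable way to do the piping; the original may differ.
            protected InputStream readInternal(Class<? extends InputStream> clazz, HttpInputMessage inputMessage) {
                return inputMessage.body
            }

            protected void writeInternal(InputStream inputStream, HttpOutputMessage outputMessage) {
                // Copy in chunks so large payloads never need to fit in memory, then release the stream.
                StreamUtils.copy(inputStream, outputMessage.body)
                inputStream.close()
            }
        })
    }
}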
azhawkes / SnakeCaseApplicationConfiguration.java
Created October 15, 2015 15:32
Spring Boot: convert inbound parameters from snake_case to camelCase
import java.io.IOException;
import java.util.Collections;
import java.util.Enumeration;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletRequestWrapper;
import javax.servlet.http.HttpServletResponse;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.filter.OncePerRequestFilter;

import com.google.common.base.CaseFormat; // Guava, used for the snake_case -> camelCase conversion

@Configuration
public class SnakeCaseApplicationConfiguration {
    @Bean
    public OncePerRequestFilter snakeCaseConverterFilter() {
        return new OncePerRequestFilter() {
            @Override
            protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response, FilterChain filterChain) throws ServletException, IOException {
                final Map<String, String[]> parameters = new ConcurrentHashMap<>();
                for (String param : request.getParameterMap().keySet()) {
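                    // NOTE: the gist preview is truncated above; the rest of the filter is a
                    // hedged reconstruction, not the gist's verbatim code. Guava's CaseFormat is
                    // an assumed dependency here; any snake_case -> camelCase conversion works.
                    String camelCased = CaseFormat.LOWER_UNDERSCORE.to(CaseFormat.LOWER_CAMEL, param);
                    parameters.put(camelCased, request.getParameterValues(param));
                    // Keep the original name too, so snake_case lookups are unaffected.
                    parameters.put(param, request.getParameterValues(param));
                }

                // Continue the chain with a wrapper that answers from the converted map.
                filterChain.doFilter(new HttpServletRequestWrapper(request) {
                    @Override
                    public String getParameter(String name) {
                        return parameters.containsKey(name) ? parameters.get(name)[0] : null;
                    }

                    @Override
                    public Enumeration<String> getParameterNames() {
                        return Collections.enumeration(parameters.keySet());
                    }

                    @Override
                    public String[] getParameterValues(String name) {
                        return parameters.get(name);
                    }

                    @Override
                    public Map<String, String[]> getParameterMap() {
                        return parameters;
                    }
                }, response);
            }
        };
    }
}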
azhawkes / spider.sh
Created January 13, 2014 18:00
Really simple wget spider that obtains a list of URLs on a website by crawling n levels deep from a starting page.
#!/bin/bash

# Starting page and crawl settings. Don't call the first variable HOME: overriding the
# shell's HOME would break wget's ~/.wgetrc lookup.
START_URL="http://www.yourdomain.com/some/page"
DOMAINS="yourdomain.com"
DEPTH=2
OUTPUT="./urls.csv"

# Crawl recursively without saving anything, stay within $DOMAINS, and stop at $DEPTH levels.
# Each request shows up in wget's log as a line starting with "--<timestamp>--" whose third
# field is the URL; filter out static assets, then de-duplicate.
wget -r --spider --delete-after --force-html -D "$DOMAINS" -l "$DEPTH" "$START_URL" 2>&1 \
    | grep '^--' | awk '{ print $3 }' | grep -v '\.\(css\|js\|png\|gif\|jpg\)$' | sort -u > "$OUTPUT"
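Usage note: edit the variables at the top, then run the script directly (e.g. bash spider.sh). The output file is a plain one-URL-per-line list despite the .csv extension. The grep '^--' filter depends on wget's default log format, where each requested URL appears on a line like "--2014-01-13 18:00:00--  http://www.yourdomain.com/some/page", with the URL as the third whitespace-separated field.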