@CharmedSatyr
Last active August 1, 2018 15:01
Interview Question (JS Bin; source: https://jsbin.com/wurebac)
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width">
<title>JS Bin</title>
</head>
<body>
<script id="jsbin-javascript">
/*
Please write a JavaScript function that takes an array of objects and
returns a unique-ified version of the same array.
To be clear, the goal is just to remove any duplicate objects that
reference the same object in memory. The result can have multiple objects
with the same properties as long as they are different objects in memory.
There are multiple ways to solve this, but the goal is to create an
algorithm that is as fast as possible. You may use whatever resources
you want without directly consuming a third-party library.
*/
// MY ANSWER
// `Set` objects store unique values and can be constructed from any
// iterable. `dedup` builds a `Set` from `arr` (which discards duplicate
// references) and converts it back to an array with `Array.from()`.
// We might include refinements like error handling for invalid types.
const dedup = arr => {
  if (Array.isArray(arr)) {
    return Array.from(new Set(arr));
  } else {
    throw new Error('dedup: Argument must be an array');
  }
};
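// A quick usage sketch. A minimal copy is re-declared as `dedupExample`
// (a hypothetical name) so this snippet stands alone without clashing
// with the `dedup` declaration above:
const dedupExample = arr => Array.from(new Set(arr));
const x = { id: 1 };
const y = { id: 2 };
console.log(dedupExample([x, x, y]).length); // 2 -> the duplicate reference to x is removed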
// FALLBACK ANSWER
// This method suits older browsers without `Set` support. Iterate
// through `arr`, pushing each element that is not already in the
// accumulator array. `indexOf` uses a strict equality check, so it
// distinguishes visually identical objects from multiple references to
// the same object. Note that `indexOf` rescans the accumulator on every
// iteration, making this roughly O(n^2) versus the near-O(n) `Set`
// approach.
function reduceToUniques(arr) {
  if (Array.isArray(arr)) {
    return arr.reduce(function(acc, curr) {
      if (acc.indexOf(curr) < 0) {
        acc.push(curr);
      }
      return acc;
    }, []);
  } else {
    throw new Error('reduceToUniques: Argument must be an array');
  }
}
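// A quick standalone check of the fallback approach. A minimal copy is
// declared as `reduceExample` (a hypothetical name) so this snippet does
// not clash with the declarations above:
const reduceExample = arr =>
  arr.reduce((acc, curr) =>
    acc.indexOf(curr) < 0 ? (acc.push(curr), acc) : acc, []);
const p = { n: 1 };
// One of the two references to `p` is dropped; the visually identical
// but distinct object is kept.
console.log(reduceExample([p, p, { n: 1 }]).length); // 2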
// TESTS
// We might also test behaviors outside the spec, e.g. that the function:
// 1) returns an array when passed one,
// 2) throws an error when passed an invalid argument,
// 3) removes non-object duplicates,
// etc., but those issues are set aside here.
// Test 1: The function should remove objects that reference the
// same object in memory.
const a = { orange: 'the new black' };
const b = a; // a and b reference the same object in memory
const arr1 = [a, b];
const deduped1 = dedup(arr1);
if (deduped1.length === 1 && deduped1[0] === a) {
  console.log('Test 1: remove identical references -> passed');
} else {
  console.error('Test 1: remove identical references -> failed');
}
// Test 2: The function should keep multiple objects with the same
// properties as long as they are different objects in memory.
const c = { casper: 'the friendly ghost' };
const d = { casper: 'the friendly ghost' };
const e = { casper: 'the friendly ghost' };
// Three different objects with identical keys and values
const arr2 = [c, d, e];
const deduped2 = dedup(arr2);
if (deduped2.length === arr2.length &&
    arr2.every((v, i) => v === deduped2[i])
) {
  console.log('Test 2: ignore visually similar properties -> passed');
} else {
  console.error('Test 2: ignore visually similar properties -> failed');
}
// Test 3: The function should be as fast as possible.
// To check, compare the performance of `dedup` and `reduceToUniques`
// above. (Assume `reduceToUniques` passes the same sorts of tests.) My
// answer passes if it is at least as fast as the competition and fails
// otherwise. Run the bin a few times to see which algorithm is snappier!
// The test argument is an array of 250 references to the same object,
// requiring lots of deduplication.
const f = { num: 1 };
const arr3 = Array(250).fill(f);
// `dedup` timer
const t0 = performance.now();
dedup(arr3);
const t1 = performance.now();
const t = t1 - t0; // Total time elapsed
// fallback function timer
const fbt0 = performance.now();
reduceToUniques(arr3);
const fbt1 = performance.now();
const fbt = fbt1 - fbt0;
if (t <= fbt) {
  console.log('Test 3: execution speed comparison -> passed');
} else {
  console.error('Test 3: execution speed comparison -> failed');
}
// Note that `performance.now()` is deliberately imprecise for security
// reasons, but I had trouble getting `console.time()` to work with JS
// Bin. On a local machine using `console.time()`, `dedup` averaged
// 0.0205ms over 6 runs, while `reduceToUniques` averaged 0.0413ms.
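// As a hedged sketch of the local averaging described above (assuming an
// environment where `performance.now()` is available, e.g. a browser or
// modern Node), one might average several runs rather than trusting a
// single timing. `avgTime` is a hypothetical helper, not part of the
// original answer:
const avgTime = (fn, arg, runs = 6) => {
  let total = 0;
  for (let i = 0; i < runs; i++) {
    const start = performance.now();
    fn(arg);
    total += performance.now() - start;
  }
  return total / runs; // mean elapsed milliseconds across `runs` calls
};
// e.g. compare avgTime(dedup, arr3) against avgTime(reduceToUniques, arr3)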
</script>
</body>
</html>