- TURK (YÖRÜK, TATAR, ANATOLIAN TURKMEN -MANAV-) 2
- BALKAN MIGRANT (BULGARIAN, BOSNIAK, POMAK, MACEDONIAN, ALBANIAN, KOSOVAR, BALKAN TURKMEN) 1
- EASTERN ANATOLIA / ARMENIAN / AZERI 1
- TURK (YÖRÜK, TATAR, ANATOLIAN TURKMEN -MANAV-) 1
// based on this blog: https://effectivetypescript.com/2020/04/09/jsonify/
type Invalid = undefined | Function | symbol;
// Recursively maps T to the shape it has after a JSON.stringify/JSON.parse round trip.
export type Jsonify<T> = T extends { toJSON(): infer U }
  ? U
  : T extends BigInt | Invalid
  ? never
  : T extends Number
  ? number
  : T extends String
  ? string
  : T extends Boolean
  ? boolean
  : T extends object
  ? { [K in keyof T]: Jsonify<T[K]> }
  : T;
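For example (a hypothetical User type, purely to illustrate the mapping): a Date field becomes a string, because Date has a toJSON method that returns a string.

type User = { name: string; joined: Date };
type SerializedUser = Jsonify<User>; // resolves to { name: string; joined: string }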
#!/usr/bin/env node
const npx = require('libnpx')
const path = require('path')

const packageName = "figlet"
const NPM_PATH = path.join(__dirname, 'node_modules', 'npm', 'bin', 'npm-cli.js')

// Install the package on the fly using npx's internals, then require it from the install prefix.
npx._ensurePackages(packageName, { npm: NPM_PATH })
  .then(results => {
    const figlet = require(path.join(results.prefix, 'lib', 'node_modules', packageName))
    // figlet's main export renders text as ASCII art
    figlet('hello', (err, output) => console.log(err || output))
  })
  .catch(err => console.error(err))
import { Link as RemixLink } from "@remix-run/react";
import React from "react";

// Reuse Remix's own Link props, but expose the destination as `href`
// (Next.js-style) instead of Remix's `to`.
type RemixLinkProps = Parameters<typeof RemixLink>[0];

export type LinkProps = { href: RemixLinkProps["to"] } & Omit<
  RemixLinkProps,
  "to"
>;
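A minimal sketch of a wrapper component that could consume these props (the actual component isn't shown in the preview above; this is only an illustration):

export function Link({ href, ...rest }: LinkProps) {
  return <RemixLink to={href} {...rest} />;
}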
import db from "./db"

export async function get({ queryParams }) {
  try {
    const user = await db.getUser(queryParams.userName);
    if (user) {
      return {
        status: 200,
        body: {
          user,
        },
      };
    }
    // not-found and error responses assumed for the branches cut off in the preview
    return { status: 404 };
  } catch (err) {
    return { status: 500, body: { error: err.message } };
  }
}
module Main where

import Prelude
import Data.Enum
import Data.Maybe
import Data.List
import Effect
import TryPureScript

-- The preview shows only the imports; a minimal main is assumed here so the
-- module compiles on Try PureScript (render and text come from TryPureScript).
main :: Effect Unit
main = render (text "Hello")
I remember thinking that the way we're doing JavaScript is complex, but we don't have any choice. What we've been doing for the last few years is downloading a lot of JavaScript modules from npm into our node_modules folder, then transforming and bundling them for browsers with webpack and babel. This was necessary because browsers didn't support new features, most importantly modules, and sending a lot of separate files to the browser was inefficient, so we transformed and bundled ahead of time.
Now the times are changing. Most popular browsers support the crucial features, including modules, and HTTP/2 makes it more efficient to send a bunch of separate files. But we're stuck with the old ways, and we're paying the price for what made sense at the time. As it turns out, putting all your JavaScript in one bundle is not that efficient either: you're sending non-essential code, which makes the page load and parse slower, hurting the user experience.
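A rough sketch of what that looks like with native modules and no bundler (file names here are hypothetical): the browser loads an entry module via a script tag with type="module", and anything non-essential is pulled in lazily with a dynamic import().

// main.ts, referenced from the page as <script type="module" src="main.js">
import { renderPage } from "./render.js"; // loaded up front, needed for the first paint

renderPage(document.querySelector("#app")!);

// Rarely used code stays out of the critical path and is fetched only on demand.
document.querySelector("#chart-button")?.addEventListener("click", async () => {
  const { drawChart } = await import("./chart.js");
  drawChart();
});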
I'm currently running an experiment. I'm building a non-trivial web app using JavaScript, which should work with no missing functionality if JavaScript is disabled. That's an undertaking for sure, but it's how it used to be in the olden times. I just wanted to try building a product on web fundamentals, where JavaScript only enhances the experience rather than being a requirement. Call it graceful degradation taken to the extreme.
How did I do it? My framework of choice was Next.js, which was actually the inspiration for this experiment. With Next.js you build pages using React, and Next.js handles everything for you. What do I mean by everything? When you make a request to the server, Next.js gathers the data required to render the page and sends the HTML along with the required JavaScript, and the page is rehydrated on the client. When you navigate to another page, something magical happens: the Next.js runtime fetches the scripts required for the navigated page and performs the transition entirely on the client, without a full page load.
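Roughly, a page in that setup looks like the sketch below (a hypothetical route and props, assuming a recent Next.js with the pages router): data is fetched on the server for every request, the rendered HTML works without JavaScript, and Link upgrades to client-side navigation when JavaScript is available.

// pages/users/[userName].tsx -- hypothetical page, for illustration only
import Link from "next/link";
import type { GetServerSideProps } from "next";

type Props = { userName: string };

// Runs on the server on every request, so the HTML is complete before any JS runs.
export const getServerSideProps: GetServerSideProps<Props> = async ({ params }) => {
  return { props: { userName: String(params?.userName) } };
};

export default function UserPage({ userName }: Props) {
  return (
    <main>
      <h1>{userName}</h1>
      {/* Renders a plain <a href="/">; with JavaScript enabled the Next.js
          runtime intercepts the click and navigates on the client. */}
      <Link href="/">Home</Link>
    </main>
  );
}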
// @flow
import React, { Component } from "react"
import { isIP, isURL, isEmail, isPort } from "validator"
import { allValid, getErrors } from "./validation"
import { makeAPICall } from "./api"

// a flat data structure for viewing in the UI
type Inputs = {
  hasAuth: boolean,
  // further fields (validated with isIP/isURL/isEmail/isPort above) are cut off in the preview
}
I hereby claim:
To claim this, I am signing this object: