String str = "a吉野屋𠮷野\r\n屋ア緣イ?"; | |
// byte[] bytes = str.getBytes("Windows-31j"); | |
char alt = '□'; | |
final String charset = "Windows-31j"; | |
String filteredStr = filterLowSurrogate(str); | |
String translatedStr = new String(str.getBytes(charset), Charset.forName(charset)); | |
StringBuilder sb = new StringBuilder(filteredStr.length()); | |
for (int i = 0; i < filteredStr.length(); i++) { |
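The preview cuts off at the loop. Below is a minimal self-contained sketch of how it could continue, assuming filterLowSurrogate() (not shown in the original) drops low surrogates so that filteredStr lines up index-for-index with the round-tripped string, in which every unmappable code point collapses to a single '?':

import java.nio.charset.Charset;

public class Windows31jReplaceSketch {

    // Assumed helper: drop low surrogates so each supplementary code point keeps
    // exactly one char, matching the single '?' it becomes after the round trip.
    static String filterLowSurrogate(String s) {
        StringBuilder out = new StringBuilder(s.length());
        for (char c : s.toCharArray()) {
            if (!Character.isLowSurrogate(c)) {
                out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        String str = "a吉野屋𠮷野\r\n屋ア緣イ?";
        char alt = '□';
        final String charset = "Windows-31j";

        String filteredStr = filterLowSurrogate(str);
        // Characters Windows-31J cannot encode come back as '?' after the round trip.
        String translatedStr = new String(str.getBytes(charset), Charset.forName(charset));

        StringBuilder sb = new StringBuilder(filteredStr.length());
        for (int i = 0; i < filteredStr.length(); i++) {
            char original = filteredStr.charAt(i);
            char roundTripped = translatedStr.charAt(i);
            // Replace what the charset could not encode, but keep a literal '?' as-is.
            sb.append(roundTripped == '?' && original != '?' ? alt : original);
        }
        System.out.println(sb); // 𠮷 (outside Windows-31J) is replaced by □
    }
}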
/*
 * To change this license header, choose License Headers in Project Properties.
 * To change this template file, choose Tools | Templates
 * and open the template in the editor.
 */
package cn.orz.pascal.tinybench;

import java.io.IOException;
import java.nio.file.Path;
import java.text.ParseException;
// Check what javac folds at compile time: literal + literal and
// final (compile-time constant) + constant are concatenated by the compiler,
// while plain local variables force runtime concatenation.
public static void checkJavacOptimizationPlus() {
    String hello = "hello(v)";
    String world = "world(v)";
    final String HELLO = "hello(c)";
    final String WORLD = "world(c)";
    String mixedStr2 = "hello(l)" + "world(l)" + HELLO + WORLD + hello + world;
}
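One way to observe what javac actually folds, not taken from the original gist, is to compare references: concatenations of string literals and final compile-time constants are folded into a single interned literal, while concatenations involving plain local variables happen at run time and produce a fresh object:

public class JavacFoldingCheck {
    public static void main(String[] args) {
        final String HELLO = "hello(c)";
        final String WORLD = "world(c)";
        String hello = "hello(v)";
        String world = "world(v)";

        // Literals and final compile-time constants are folded by javac and interned.
        String constant = "hello(l)" + "world(l)" + HELLO + WORLD;
        System.out.println(constant == "hello(l)world(l)hello(c)world(c)"); // true

        // Non-constant operands force runtime concatenation, yielding a new object.
        String mixed = constant + hello + world;
        System.out.println(mixed == "hello(l)world(l)hello(c)world(c)hello(v)world(v)"); // false
    }
}

Running javap -c on the compiled class shows the same thing: the constant part is loaded with a single ldc, while the variable part goes through a runtime concatenation (StringBuilder, or invokedynamic on newer javac versions).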
docker run -d --name zookeeper -p 2181:2181 confluent/zookeeper
docker run -it --name kafka -p 9092:9092 --link zookeeper:zookeeper confluent/kafka
# Run where the Kafka CLI is available and the linked zookeeper host resolves
# (e.g. inside the kafka container).
kafka-topics --create --zookeeper zookeeper:2181 --replication-factor 1 --partitions 1 --topic test
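To check the topic end to end, here is a minimal producer sketch using the standard kafka-clients API (not part of the original commands; it assumes the broker is reachable on localhost:9092 as published by the kafka container above):

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class TestProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker published by the kafka container above.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Send one message to the "test" topic created above.
            producer.send(new ProducerRecord<>("test", "key", "hello kafka"));
        }
    }
}

The message can then be read back with the kafka-console-consumer tool against the same topic.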
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: wf-exec-ml-
spec:
  entrypoint: exec-ml
  volumes:
    - name: google-cloud-key
      secret:
        secretName: btc-prediction-key
We are in the great age of many-core. We don't have big data.
Or rather, strictly speaking, we do. But not everything in our work is big data.
We need to manage, by the thousands, ordinary jobs over data of at most a few hundred GB.
That is the reality. What we need is not a platform for processing big data efficiently, but a way to manage somewhat-large data smartly.
Big data platforms are wonderful. They come with an ecosystem for managing huge numbers of machines efficiently.
But is that enough? No. We do not need huge numbers of servers.
Today a single server with dozens of vCPUs is ordinary, and because it is not a huge fleet, it can be expected to be reasonably reliable.
require 'nokogiri'

# Recursively map each node to [tag name, number of children, children mapped the same way].
def f(node)
  node.children.map do |x|
    [x.name, x.children.size, f(x)]
  end
end

def nodes2text(nodes, level)
  nodes.map { |n|
module TestOperations
  def test(case_name)
    @case_name = case_name
    r = yield
    # A non-nil result from the block is treated as a failure.
    if r != nil
      if @cnt_failed == nil
        @cnt_failed = 0
      end
      @cnt_failed += 1
An API image for transparently splitting and processing a huge file, like Spark's RDD:
Record.partition(10).read("hoge.txt").map(line -> line.split(","))
Internally, it might translate into something like the following (a fuller sketch is below):
IntStream.range(0, 10).parallel().mapToObj(n -> {
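The fragment cuts off there. Here is a minimal self-contained sketch of that internal translation, not the author's implementation: it splits the input by line count into a fixed number of partitions and processes them in parallel with IntStream (the file name hoge.txt comes from the example above; the even line-based split is an assumption):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class PartitionedReadSketch {
    public static void main(String[] args) throws IOException {
        int partitions = 10;
        // Read all lines once; a production version would assign byte ranges per partition
        // and stream lazily instead of materializing the whole file.
        List<String> lines = Files.readAllLines(Paths.get("hoge.txt"));
        int chunk = (lines.size() + partitions - 1) / partitions;

        List<String[]> records = IntStream.range(0, partitions)
                .parallel()
                .mapToObj(n -> lines.subList(
                        Math.min(n * chunk, lines.size()),
                        Math.min((n + 1) * chunk, lines.size())))
                .flatMap(List::stream)
                .map(line -> line.split(","))
                .collect(Collectors.toList());

        System.out.println(records.size() + " records parsed");
    }
}

For genuinely large files, each partition would seek to its own byte offset and read only its slice, which is closer to how Spark's partitioned reads actually work.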