Alex Newman posix4e

🎯
Focusing
View GitHub Profile
TTT246 Individual Contributor License Agreement
In order to clarify the intellectual property license granted with Contributions from any person or entity, ttt246 must have a Contributor License Agreement ("CLA") on file that has been signed by each Contributor, indicating agreement to the license terms below. This license is for your protection as a Contributor as well as the protection of ttt246; it does not change your rights to use your own Contributions for any other purpose. You accept and agree to the following terms and conditions for Your present and future Contributions submitted to ttt246. Except for the license granted herein to ttt246 and recipients of software distributed by ttt246, You reserve all right, title, and interest in and to Your Contributions.
Definitions. "You" (or "Your") shall mean the copyright owner or legal entity authorized by the copyright owner that is making this Agreement with ttt246. For legal entities, the entity making a Contribution and all other entities that control,

Prompt

Given https://www.nytimes.com/2023/04/15/nyregion/shoplifting-arrests-nyc.html. Write a summary of the article. Include why "Nearly a third of all shoplifting arrests in New York City last year involved just 327 people, the police said. Collectively, they were arrested and rearrested more than 6,000 times." is a strange statistic. Include whether or not you can guess some misreporting or bad statistics?

Response

The article discusses the issue of shoplifting in New York City, highlighting that nearly a third of all shoplifting arrests in 2021 involved just 327 individuals who were arrested and rearrested more than 6,000 times. This strange statistic suggests that a small number of repeat offenders are responsible for a significant portion of shoplifting incidents in the city.

The high number of arrests and rearrests among this small group may indicate a lack of effective deterrents or support systems for these individuals, leading them to continue engaging in criminal behavior. It is also poss
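A quick back-of-the-envelope check makes clear why the quoted statistic stands out. Using the article's lower bound of 6,000 arrests spread across 327 people:

```python
# Sanity-check the quoted statistic: 327 people accounting for
# "more than 6,000" arrests. 6,000 is the lower bound from the article.
people = 327
arrests = 6000

avg = arrests / people  # roughly 18 arrests per person in a single year
```

An average of about 18 arrests per person per year is what makes the figure look strange: either an extraordinary concentration of repeat offending, or a counting artifact (e.g. multiple charges per incident logged as separate arrests).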

[Scene: A courtroom in the land of Crypto, where Binance, the mighty exchange, stands before the SEC, the regulator supreme]
SEC:
Hail Binance, thou art brought before us today,
On charges of selling securities astray.
'Tis said thou hast a token called BUSD,
That's sold and traded on thy platform, thus deemed.
Binance:
Hail SEC, thou regulator of high renown,
Please write a paper on why racism is hindering biology research
Introduction
Racism has been around for centuries, and its effects are still felt all over the world today. It has been a major obstacle to progress in many different fields, and one of those fields is biology research. In this paper, we will look at why racism is hindering biology research and the impact it has on the field. We will explore the history of racism in the biological sciences and the current prevalence of racism among scientists, as well as the implications of racism in biology research.
History of racism in the biological sciences
Racism has been entrenched in the biological sciences since its inception. Early scientists such as Carl Linnaeus and Charles Darwin used race as a way of classifying and categorizing humans in their work. This led to the development of pseudoscientific theories of inherent racial difference, which were used to justify racism and oppression. These theories were eventually debunked, but the legacy of
### Keybase proof
I hereby claim:
* I am posix4e on github.
* I am t00r (https://keybase.io/t00r) on keybase.
* I have a public key ASBSirC9CR8KOibvisSoOeq3qQnNPYUQc0U2iIQIrLokcAo
To claim this, I am signing this object:
HUMAN Protocol Foundation is an independent Foundation organized to support the development of the HUMAN Protocol, a new way to organize, recruit, and compensate labor, with a particular focus on machine learning applications.
The Foundation directly supports the enhancement of software related to the HUMAN Protocol and incurs other costs associated with encouraging the adoption of the Protocol.
Financial awards or grants for technical achievements will be contributed from Foundation funds and its partners.
In addition, the Foundation may choose to give 0L rewards to individual miners on the HUMAN network, proportional to the volume of their activity.
Contributions to the HUMAN Protocol Foundation are accepted at the 0L account:
E0484EA7FCF0694F79E484FC684E6C2C
Contributions will be used to fund awards of the Foundation. For professional administration of the resources of the program the Foundation will retain an administrative fee of 5% of the funds awarded to community participants.
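The fee rule above is simple, but a worked example makes it concrete. The award amount below is hypothetical; only the 5% rate comes from the text:

```python
# Illustration of the administrative fee described above: the Foundation
# retains 5% of the funds awarded to community participants.
ADMIN_FEE_RATE = 0.05

def admin_fee(awarded: float) -> float:
    """Return the administrative fee retained on an awarded amount."""
    return awarded * ADMIN_FEE_RATE

fee = admin_fee(10_000)  # 500.0 retained on a hypothetical 10,000 award
```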
Keys will be prefixed by the name of the processing strategy.
Input keys will begin with the prefix input, and all keys necessary ....
Output keys are keys meant to be dumped to S3, and are written once. That is, a consumer listens for an output dump and then deletes the key.
This flexibility means we can process with cron, events, or listeners.
Temp keys are keys being written to while a job is in progress. Their values are meant to be extended,
but data is never removed from a temp key. A good example of a temp key is the list of captchas (ekeys) associated
with a task: when the recording oracle detects the task is complete, it renames the key from temp to output for dumping.
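The scheme above (input / temp / output prefixes, with a temp-to-output rename on completion) could be sketched as follows. This is a minimal illustration, not the actual protocol code: the strategy name "recording", the helper names, and the use of a plain dict in place of the real key-value store are all assumptions.

```python
# Sketch of the key-prefix scheme described above, assuming a flat
# Redis-style key space of the form "<strategy>:<kind>:<name>".
def make_key(strategy: str, kind: str, name: str) -> str:
    """Build a key prefixed by the processing strategy and key kind."""
    assert kind in ("input", "temp", "output")
    return f"{strategy}:{kind}:{name}"

def promote_temp_to_output(store: dict, strategy: str, name: str) -> None:
    """Rename a temp key to an output key once a task is complete,
    e.g. the list of captchas when the recording oracle finishes."""
    temp_key = make_key(strategy, "temp", name)
    output_key = make_key(strategy, "output", name)
    store[output_key] = store.pop(temp_key)

# A plain dict stands in for the store; "recording" is a hypothetical strategy.
store = {make_key("recording", "temp", "task42"): ["ekey1", "ekey2"]}
promote_temp_to_output(store, "recording", "task42")
```

In a real Redis deployment the rename step would map onto the atomic RENAME command rather than a pop-and-set.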

Keybase proof

I hereby claim:

  • I am posix4e on github.
  • I am posix4e (https://keybase.io/posix4e) on keybase.
  • I have a public key ASBsVsa831RtYBdqeFZdlYpjrpMNtmL6Gs62J0WnqKNPzQo

To claim this, I am signing this object:

storage_prod=# explain SELECT name,cpu_hostname,extended,updated,created,concat(ST_Y(ST_Centroid(geom)),',',ST_X(ST_Centroid(geom))) as location FROM scenes WHERE bucket_id=30 AND EXTRACT(EPOCH FROM created) > EXTRACT(EPOCH FROM '2015-11-19'::timestamp) ORDER BY created limit 10;
QUERY PLAN
---------------------------------------------------------------------------------------------------------------
Limit (cost=0.57..81.44 rows=10 width=819)
-> Index Scan using scene_bucket_id_created_idx on scenes (cost=0.57..26376379.94 rows=3261661 width=819)
Index Cond: (bucket_id = 30)
Filter: (date_part('epoch'::text, created) > 1447891200::double precision)
(4 rows)
# Byte-patch a kernel image on a mounted USB volume, in place.
# Each pattern is x86 machine code: 0x83 0xF8 imm8 is `cmp eax, imm8`,
# so these substitutions widen the compared immediate from 0x3C / 0x3A to 0x3F.
sudo perl -pi -e 's|\x74\x11\x83\xF8\x3C|\x74\x11\x83\xF8\x3F|g' /Volumes/USB/mach_kernel
sudo perl -pi -e 's|\xEB\x0A\x83\xF8\x3A|\xEB\x0A\x83\xF8\x3F|g' /Volumes/USB/mach_kernel
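The `perl -pi -e` one-liners above rewrite a file in place, substituting one byte sequence for another. A minimal Python equivalent, demonstrated here on a throwaway temp file rather than the real `mach_kernel` image, might look like:

```python
import os
import pathlib
import tempfile

def patch_bytes(path: str, old: bytes, new: bytes) -> int:
    """Replace every occurrence of `old` with `new` in the file at `path`,
    writing the result back in place; return the number of occurrences."""
    p = pathlib.Path(path)
    data = p.read_bytes()
    count = data.count(old)
    p.write_bytes(data.replace(old, new))
    return count

# Demo on a scratch file containing one of the patterns from above.
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(b"\x00\x74\x11\x83\xF8\x3C\x00")
tmp.close()
n = patch_bytes(tmp.name, b"\x74\x11\x83\xF8\x3C", b"\x74\x11\x83\xF8\x3F")
patched = pathlib.Path(tmp.name).read_bytes()
os.unlink(tmp.name)
```

As with the perl version, this should only ever be run against a copy of the target file, not the live kernel.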