@pdxjohnny
Created November 13, 2023 16:08
Alice Engineering Comms Discussion Dump 2023-11-13T16:07:30+00:00
{
"body": "These are the engineering logs of entities working on Alice. If you\r\nwork on [Alice](https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#what-is-alice) please use this thread as a way to communicate to others\r\nwhat you are working on. Each day has a log entry. Comment with your\r\nthoughts, activities, planning, etc. related to the development of\r\nAlice, our open source artificial general intelligence.\r\n\r\nThis thread is used as a communication mechanism for engineers so that\r\nothers can have full context around why entities did what they did\r\nduring their development process. This development lifecycle data helps\r\nus understand more about why decisions were made when we re-read the\r\ncode in the future (via cross referencing commit dates with dates in\r\nengineering logs). In this way we facilitate communication across\r\nboth time and space! Simply by writing things down. We live an an\r\nasynchronous world. Let's communicate like it.\r\n\r\nWe are collaboratively documenting strategy and implementation as\r\nliving documentation to help community communicate amongst itself\r\nand facilitate sync with potential users / other communities /\r\naligned workstreams.\r\n\r\n- Game plan\r\n - [Move fast, fix things](https://hbr.org/2019/01/the-era-of-move-fast-and-break-things-is-over). [Live off the land](https://www.crowdstrike.com/cybersecurity-101/living-off-the-land-attacks-lotl/). [Failure is not an option](https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#the-scary-part).\r\n- References\r\n - [Video: General 5 minute intro to Alice](https://www.youtube.com/watch?v=THKMfJpPt8I?t=129&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw)\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_forward.md\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_preface.md\r\n - https://gist.github.com/07b8c7b4a9e05579921aa3cc8aed4866\r\n - Progress Report Transcripts\r\n - https://github.com/intel/dffml/tree/alice/entities/alice/\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#what-is-alice\r\n - https://github.com/intel/dffml/pull/1401\r\n - https://github.com/intel/dffml/pull/1207\r\n - https://github.com/intel/dffml/pull/1061\r\n - #1315\r\n - #1287\r\n - Aligned threads elsewhere (in order of appearance)\r\n - @dffml\r\n - [DFFML Weekly Sync Meeting Minutes](https://docs.google.com/document/d/16u9Tev3O0CcUDe2nfikHmrO3Xnd4ASJ45myFgQLpvzM/edit)\r\n - Alice isn't mentioned here but the 2nd party work is, Alice will be our maintainer who helps us with the 2nd party ecosystem.\r\n - #1369\r\n - @pdxjohnny\r\n - https://twitter.com/pdxjohnny/status/1522345950013845504\r\n - https://mastodon.social/@pdxjohnny/109320563491316354\r\n - [Google Drive: AliceIsHere](https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f)\r\n - Supplementary resources, status update slide decks, cards (with instructions of how to get more printed), Entity Analysis Trinity drawio files, screenshots. 
Miscellaneous other files.\r\n - async comms / asynchronous communications\r\n - https://twitter.com/SergioRocks/status/1579110239408095232\r\n\r\n## Engineering Log Process\r\n\r\n- Alice, every day at 7 AM in Portland's timezone create a system context (the tick)\r\n - Merge with existing system context looked up from querying this thread if exists\r\n - In the future Alice will create and update system contexts.\r\n - We'll start with each day, then move to a week, then a fortnight, then 2 fortnights.\r\n - She'll parse the markdown document to rebuild the system context as if it's cached\r\n right before it would be synthesized to markdown; we then run updates and trigger\r\n update of the comment body. Eventually we won't use GitHub and just DID based stuff.\r\n We'll treat these all as trains of thought / chains of system contexts / state of the\r\n art fields.\r\n - Take a set of system contexts as training data\r\n - The system context which we visualize as a line dropped from the peak of a pyramid\r\n where it falls through the base.\r\n - We use cross domain conceptual mapping to align system contexts in a similar direction\r\n and drop ones which are unhelpful, which do not make the classification for \"good\"\r\n - What remains from our circular graph is a pyramid with the correct decisions\r\n (per prioritizer) \r\n - This line represents the \"state of the art\", the remembered (direct lookup) or\r\n predicted/inferred system contexts along this line are well rounded examples of\r\n where the field is headed, per upstream and overlay defined strategic plans\r\n and strategic principles\r\n - References:\r\n - `$ git grep -C 5 -i principle -- docs/arch/`\r\n - Source: https://github.com/intel/dffml/discussions/1369\r\n - Inputs\r\n - `date`\r\n - Type: `Union[str, Date]`\r\n - Example: `2022-07-18`\r\n - Default: `did:oa:architype:current-date`\r\n - Add yesterday's unfinished TODO items to this train of thought with the following:\r\n - Create a document (docutils?)\r\n - Make the top level header `date` with \"Notes\" appended\r\n - Collect all previous day's TODOs from within the individual entity comments within the thread for the day's comment (the team summary for that day)\r\n - Drop any completed or struck through TODOs\r\n - Output a list item \"TODO\" with the underlying bullets with handle prepended and then the TODO contents\r\n - Create comments for individuals after the current system context is posted and we have a live handle to it to reply with each individual's markdown document.\r\n - Synthesize the document to markdown (there is a python lib out there that can do docutils to md, can't remember the name right now)\r\n - Upsert comment in thread\r\n- Any time (it's a game)\r\n - We find something we need to complete #1369 that was just completed by someone else\r\n - Paste `![chaos-for-the-chaos-god](https://github.com/intel/dffml/assets/5950433/636969a1-1f0f-4c96-8812-f10fa403e79c)`\r\n - chaos: diversity of authorship applied to clustering model run on all thoughts (dataflows) where each dataflow maps to a line changed and the cluster classification is the author.\r\n - https://github.com/intel/dffml/issues/1315#issuecomment-1066814280\r\n - We accelerate a train of thought\r\n - Paste \ud83d\udee4\ufe0f\r\n - We use manifests or containers\r\n - Paste all the things meme `![oci-all-the-things](https://user-images.githubusercontent.com/5950433/222979759-0dd374b2-ee5f-4cbc-92d1-5cb8de078ee8.png)`\r\n - We find more ways to on-ramp data to the hypergraph\r\n - 
`![knowledge-graphs-for-the-knowledge-god](https://user-images.githubusercontent.com/5950433/222981558-0b50593a-c83f-4c6c-9aff-1b553403eac7.png)`\r\n - We do a live demo\r\n - `![live-demo-for-the-live-demo-god](https://user-images.githubusercontent.com/5950433/226699339-45b82b38-a7fc-4f2f-a858-e52ee5a6983d.png)`\r\n - We see alignment happening\r\n - `![such-alignment](https://user-images.githubusercontent.com/5950433/226707682-cfa8dbff-0908-4a34-8540-de729c62512f.png)`\r\n - We enable bisection or hermetic or cacheable builds\r\n - `![hash-values-everywhere](https://user-images.githubusercontent.com/5950433/230648803-c0765d60-bf9a-474a-b67e-4b4177dcb15c.png)`",
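The daily tick in the Engineering Log Process above boils down to: parse yesterday's comment, carry forward unfinished TODO items (dropping completed or struck-through ones), and render a new day's entry. A minimal sketch of that carry-forward step, assuming plain `- [ ]` / `- [x]` markdown checkboxes; the function names here are hypothetical, not part of the DFFML codebase:

```python
import datetime


def carry_forward_todos(previous_comment: str) -> list:
    """Collect unfinished TODO items, dropping completed or struck-through ones."""
    todos = []
    for line in previous_comment.splitlines():
        stripped = line.strip()
        # "- [x]" marks completed, "~~" marks struck through; both are dropped
        if stripped.startswith("- [ ]") and "~~" not in stripped:
            todos.append(stripped)
    return todos


def render_daily_entry(date: datetime.date, previous_comment: str) -> str:
    """Render the next day's top level comment body."""
    lines = [f"# {date.isoformat()} Engineering Logs", "", "- TODO"]
    lines.extend(f"  {todo}" for todo in carry_forward_todos(previous_comment))
    return "\n".join(lines)


print(render_daily_entry(
    datetime.date(2022, 7, 19),
    "- [x] Kick off OSS scans\n- [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)",
))
```

The real process described above also merges per-handle replies and upserts the result via the GitHub API; this sketch covers only the TODO carry-forward rule.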
"title": "Alice Engineering Comms \ud83e\udeac",
"comments": [
{
"body": "# 2022-07-18 Engineering Logs\r\n\r\n- TODO\r\n - [x] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting",
"replies": [
{
"body": "## 2022-07-18 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- Future\r\n - Engage with Loihi community\r\n - See what we can do here, not sure yet, play with system context / mitigation inference in devcloud?\r\n - https://www.intel.com/content/www/us/en/research/neuromorphic-community.html\r\n - https://download.intel.com/newsroom/2021/new-technologies/neuromorphic-computing-loihi-2-brief.pdf\r\n - https://www.intel.com/content/www/us/en/newsroom/news/intel-unveils-neuromorphic-loihi-2-lava-software.html\r\n - \r\n- References\r\n - https://medium.com/51nodes/decentralized-schema-registry-aa662b8db12b\r\n - https://www.microsoft.com/security/blog/2021/10/06/microsofts-5-guiding-principles-for-decentralized-identities/\r\n - https://ariadne.space/2022/07/17/how-efficient-can-cat1-be/\r\n - Usage of splice\r\n - https://github.com/NVlabs/eg3d\r\n - Seeing from different perspectives, inter domain conceptual mapping, encoded sysctxs alternate mitigations\r\n - https://github.com/robmarkcole/satellite-image-deep-learning\r\n - Knitting together system contexts (Alice could use for integration of various architectures)"
}
]
},
{
"body": "# 2022-07-19 Engineering Logs\r\n\r\n- TODO\r\n - [x] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting",
"replies": [
{
"body": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- Some good spdx DAG stuff on how we turn source into build SBOM wise\r\n - https://lists.spdx.org/g/Spdx-tech/message/4659\r\n- References\r\n - https://github.com/nsmith5/rekor-sidekick\r\n - > Rekor transparency log monitoring and alerting\r\n - Leverages Open Policy Agent\r\n - Found while looking at Open Policy Agent to see if we can serialize to JSON.\r\n - Possibly use to facilitate our downstream validation\r\n - https://github.com/intel/dffml/issues/1315\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments\r\n\r\n> \"Intel Corporation congratulates the DID Working Group on Decentralized Identifier (DID) 1.0 reaching W3C Recommendation status.\r\n>\r\n> DID provides a framework to unify and consolidate multiple evolving identity systems. Consolidating identity systems within a single framework is useful for validating the authenticity of information and preserving its integrity as it is moved and processed among cloud, edge, and client systems. This potentially increases the capabilities of the Web to connect and unify multiple sources of information.\r\n>\r\n> The continuing evolution of this work will be key to the development of new technologies in the fields of supply chain management and Internet of Things (IoT) devices and services. 
For example, a Birds of a Feather (BOF) discussion group at IETF [Supply Chain Integrity, Transparency, and Trust (SCITT)](https://datatracker.ietf.org/doc/bofreq-birkholz-supply-chain-integrity-transparency-and-trust-scitt/) has already highlighted DID as a useful approach in providing much needed structure for exchanging information through the supply chain, and the Web of Things (WoT) WG is planning to support DID for identifying and discovering IoT devices and metadata.\r\n>\r\n> Intel Corporation supports this work and encourages the DID Working Group to continue working towards the convergence of widely implemented and adopted standardized best practices for identity in its next charter.\"\r\n>\r\n> Eric Siow, Web Standards and Ecosystem Strategies Director, Intel Corporation\r\n\r\n\r\n\r\n\r\n- https://blog.devgenius.io/top-10-architecture-characteristics-non-functional-requirements-with-cheatsheat-7ad14bbb0a9b\r\n\r\n> ![image](https://user-images.githubusercontent.com/5950433/179842612-5fb02fb5-1f26-4cb4-af0d-d375b1134ace.png)\r\n\r\n- For Vol 3, on mind control\r\n - https://bigthink.com/the-present/sophists/\r\n\r\n---\r\n\r\nUnsent to Mike Scovetta: michael.scovetta (at) microsoft.com\r\n\r\nHi Mike,\r\n\r\nHope you\u2019ve been well. It\u2019s John from Intel. Thanks again to you and the team for welcoming me to the Identifying Security Threats working group meeting [2021-02-18](https://docs.google.com/document/d/1AfI0S6VjBCO0ZkULCYZGHuzzW8TPqO3zYxRjzmKvUB4/edit#heading=h.mfw2bj5svu9u) last year. We talked a bit about how Intel had a similar effort. I then changed roles hoping to get more involved with OpenSSF but then ended up getting told to be uninvolved. Now I switched roles again and involvement is in scope! Sorry for the lapse in communications.\r\n\r\nI periodically check the minutes so I joined today and asked about the \"Alpha-Omega\" project from last week\u2019s minutes which I then did some research on. We just started what looks to me to be an aligned project, coincidentally named Alice Omega Alpha: https://github.com/intel/dffml/tree/alice/entities/alice\r\n\r\nIt looks to me like Alice's mission to proactively enable developers and organizations to deliver organizationally context aware, adaptive secure by default best practices to teams aligns with project Alpha-Omega\u2019s goals.\r\n\r\nAlice is the nickname for both the entity and the architecture, the Open Architecture, which is a methodology for interpretation of existing, well-established formats, protocols, and other domain specific representations of architecture. What we end up with is some JSON, YAML, or other blob of structured data that we can use to build cross language tooling focused more on policy and intent, incorporating data from arbitrary sources to create a holistic picture of software across dependency boundaries by focusing on threat models.\r\n\r\nAlice will be doing scans of open source projects and we\u2019d still love to collaborate to contribute metrics to the OpenSSF metrics database, we can easily have her shoot applicable metrics off to that DB. We\u2019ve also been looking at fusing VEX and DIDs to facilitate distributed vulnerability disclosure and patch distribution.\r\n\r\n---\r\n\r\nUnsent to Jun Takei: jun.takei (at) intel.com\r\n\r\nThe W3C today issued the recommendation on DIDs. 
Jun I saw from Eric's\r\ncomment on the press release that the SCITT working group has a SCITT\r\nArchitecture which DIDs might be suitable for.\r\n\r\nThe DFFML community is working on a project called Alice\r\nhttps://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\nShe is intended to be a developer helper. She's also the way we data mine\r\nsource repositories (etc.).\r\n\r\nShe\u2019s open source with a plugin system (\"overlays\") so we can write open source code\r\nand then just add our internal integrations. This system relies on an abstraction of\r\narchitecture known as the Open Architecture. The Open Architecture, also known as\r\nAlice, is a methodology for interpreting directed graphs of domain specific architectures.\r\nAlice is the name we give both the entity and the architecture. We are hoping to\r\nhave Alice store and process information backed by directed graphs of DIDs, SBOMs, and\r\nVEX info primarily. This sounds very similar to the SCITT Architecture. We would love to\r\ncollaborate with you both to help make SCITT a success. Alice is focused on analysis of\r\nour software supply chain so as to ensure we conform to best practices. We would like\r\nthe analysis to serialize directly to an industry best practice format for that as well,\r\nwhich SCITT looks to be.\r\n\r\nTo increase the level of trust in our supply chain we would like to ensure interoperability\r\nup and down the stack. Ned is involved in the DICE space and communicated to me\r\nthat \r\n\r\nPlease let us know where things are at with your involvement with DIDs and SCITT so we\r\ncan be in sync with Intel's involvement and direction in this space. Please also let us know\r\nhow we could best establish an ongoing line of communication so as to build off and\r\ncontribute to where possible the work you're involved in.\r\n\r\nReferences:\r\n- https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n- https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n- https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun and Mike and Dan lorenc.d (at) gmail.com\r\n\r\nI commented on the OpenSSF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information.\r\n\r\nWe've been looking potentially at a hybrid DID plus rekor\r\narchitecture (DIDs eventually as a proxy to) \r\n\r\nReferences:\r\n- https://github.com/sigstore/rekor\r\n"
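As a rough illustration of the "JSON, YAML, or other blob of structured data" the letters above describe, a serialized system context might look something like the following. This shape is purely hypothetical; the thread does not pin down an Open Architecture schema:

```python
import json

# Hypothetical shape only, illustrating a directed-graph-of-provenance blob:
# an upstream dataflow, applied overlays, and supply chain metadata references.
system_context = {
    "upstream": {"dataflow": "alice.please.contribute:recommended_community_standards"},
    "overlays": ["did:oa:overlay:internal-integrations"],
    "provenance": {
        "sbom": "sha256:...",  # digest placeholder for the producing build's SBOM
        "vex": [],             # applicable VEX statements, empty in this sketch
    },
}
print(json.dumps(system_context, indent=2))
```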
}
]
},
{
"body": "# 2022-07-20 Engineering Logs\r\n\r\n- TODO\r\n - [x] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] @dffml: Get involved in SCITT\r\n - [ ] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.",
"replies": [
{
"body": "## 2022-07-20 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [x] Get involved in SCITT\r\n - [ ] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- References\r\n - https://static.sched.com/hosted_files/ossna2022/9b/presentation.pdf\r\n - > We're starting to put everything in registries, container images, signatures, SBOMs, attestations, cat pictures, we need to slow down. Our CI pipelines are designed to pass things as directories and files between stages, why aren't we doing this with our container images? OCI already defines an Image Layout Specification that defines how to structure the data on disk, and we should normalize how this is used in our tooling. This talk looks at the value of using the OCI Layout spec, what you can do today, what issues we're facing, and a call to action for more standardization between tooling in this space.\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun and Mike and Yan\r\n\r\nI commented on the OpenSFF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information.\r\n\r\nWe've been looking potentially at a hybrid DID plus rekor\r\narchitecture (DIDs eventually as a proxy to) \r\n\r\nReferences:\r\n- https://github.com/sigstore/rekor\r\n"
},
{
"body": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n - Will do doodle poll or something for slots\r\n - Jen is the one for the Zoom setup\r\n - Amir from ostif.org volunteering to answer questions\r\n - May want to do a blog in addition to twitter\r\n - Outreach maybe 4th or 5th, have the twitter points back\r\n to the blog to capture that.\r\n- Meeting time update\r\n - We have been doing this at this time for about a year or so\r\n - We previously alternated between two timeslots for Europe and Asia\r\n - Should we keep this 10 AM Pacific timeslot?\r\n - Alternate between US and APAC friendly timezone\r\n - Most other WGs are morning Pacific time\r\n- Technical Advisory Committee (TAC) update\r\n - They are tasked with making sure we are delivering on our\r\n cohesive promise, part of that is visuabliity and transparency\r\n into the work that we do.\r\n - We now have a formal reporting process\r\n - It's not a periodic we're all invited to show up to the TAC meeting\r\n one slide per project.\r\n - What we're doing\r\n - Why we're doing it\r\n - It's meant as an FYI, we are not asking for approval, we're letting them\r\n know what we're up to.\r\n - Everyone who is driving a process or project or thing, please send Mike\r\n a single slide, what is it, why are we doing it, what the status is,\r\n what's coming next, and if you need anything\r\n - Christine on metrics\r\n - Luigi for SECURITY-INSIGHTS.yml`\r\n - Mike will send out a template\r\n - Please fill and respond by Monday\r\n - Mike says the metrics work should live under a working group, maybe this one, maybe best practices\r\n - CRob might have an opinion here, as long as work gets done\r\n - As an org OpenSSF would benefit by being less siloed\r\n - Question on if we should align to streams?\r\n - LFX specific definition of metrics in mobilization paper\r\n - AR for Christine to sync with CRob and see what he thinks.\r\n - Will raise with TAC next week.\r\n- A few action items for metrics from Christine\r\n - Working groups are adopting streams from the mobilization plans\r\n- Mike: Alpha Omega\r\n - A few people were on the public call earlier\r\n - The recording will be on YouTube\r\n - Mike will give the fast version of the presentation right now\r\n - They are still hiring\r\n - Exploring ways of allocating headcount other than direct hiring\r\n - If you know anyone or are interested please apply or ping them!\r\n - Alpha\r\n - Announced Node, Python, Eclipse\r\n - Omega\r\n - Toolchain is pending\r\n - Waiting for legal approval due to the way the license for CodeQL works\r\n - Had a CVE in Node that got fixed earlier this month\r\n - RCE in JSHint that was bitrotted (unused) we removed\r\n - Two CVEs discloudsed yetserday 
and two more in the works (couple weeks to release ETA)\r\n - Found NodeJS vuln via system call tracing\r\n - It tries to query `openssl.cnf` and dumps strace logs to a repo\r\n - You then have a one stop shop of show me every linked package, when a binary starts, whether it does a DNS query\r\n - John: Sounds aligned with Alice's goals\r\n - https://sos.dev coming under Alpha-Omega\r\n - Allows us to compensate dev directly\r\n - How to participate\r\n - Improve security tools\r\n - https://sos.dev\r\n - Join working groups\r\n - Get on slack\r\n- Amir: Security Reviews\r\n - Repo is looking good\r\n - Updating with four new audits that ostif.org published last week\r\n - At almost 100 reviews from Mike (Omega work), ostif.org, and community\r\n - We're gaining traction, getting good stuff in there all the time\r\n - Might need some help with the automated testing that gets done\r\n when we upload reviews.\r\n - Feedback always welcome.\r\n- John: Collection of metric / Alpha-Omega data into shared DB\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n - Mike\r\n - Mike has been thinking about SCITT as a schema and rules on how one would assert facts; whether it's confidential compute or traditional permissions is an implementation detail.\r\n - If metrics runs across your repo and you have 30 contributors, great\r\n - As a consumer, how can I discover that fact and trust that it's accurate\r\n - Could imagine a world where things like Scorecard express the data as a SCITT assertion\r\n - You go and query that store and you say tell me everything you know about foo and you get it all back\r\n - Until we have an implementation with Web5 that's at least beta, we could explore what that looks like.\r\n - John: We can do rekor for now, we'll bridge it all later, target 1-2 years out\r\n - John: We have alignment. Time to execute. rekor + sigstore for metric data attestation signed with GitHub OIDC tokens. We care about data provenance. We will later bridge into the Web5 space, used as central points of comms given DID as effectively the URL of the future. This is in relation to what we talked to Melvin about with data provenance. We need to start planning how we are going to build up this space now so we can have provenance on thoughts later. This provenance could be for example on inference derived from provenance from training data and model training env and config. This will allow us to ensure the prioritizer makes decisions based on Spirit of the law / aka intent based policy derived from the Trinity of Static Analysis, Dynamic Analysis, and Human Intent.\r\n - Living Threat Model threats, mitigations, trust boundaries as initial data set for cross domain conceptual mapping of the trinity to build pyramid of thought alignment to strategic principles.\r\n - One of our strategic plans / principles says: \"We must be able to trust the sources of all input data used for all model training was done from research studies with these ethical certifications\"\r\n - This allows us to write policies (Open Policy Agent to JSON to DID/VC/SCITT translation/application exploration still in progress) for the organizations we form and apply them as overlays to flows we execute where context appropriate. 
These overlaid flows define the trusted parties within that context as applicable to the active organizational policies and the top level system context.\r\n - The policy associated with the principle that consumes the overlaid trust attestations we will implement as an LTM auditor which checks the SCITT provenance information associated with the operation implementations and the operation implementation network, input network, etc. within the orchestrator's trust boundary (TODO need to track usages / `reuse` of contexts `ictx`, `nctx`, etc. with something predeclared, aka at runtime if your `Operation` data structure doesn't allowlist your usage of it you can pass it to a subflow for reuse). This allows us to use the format within our orchestration and for static analysis because we can use this same format to describe the trust boundary properties that other domain specific representations of architecture have. For instance, if we were doing an Open Architecture (OA) Intermediate Representation (IR) for an ELF file we might note that the input network context is not reused from the top level system context. Whereas if we did an OA IR for Python code we would say that the input network is reused from the top level system context (it has access to that memory region, whereas when you launch an ELF you lose access to the parent's memory region, typically).\r\n - Christine\r\n - Looking at trying to connect all the different data sources\r\n- References\r\n - [Meeting Notes](https://docs.google.com/document/d/1AfI0S6VjBCO0ZkULCYZGHuzzW8TPqO3zYxRjzmKvUB4/edit?usp=sharing)\r\n - [GitHub Workgroup Page](https://github.com/ossf/wg-identifying-security-threats)\r\n - [OpenSSF Slack](https://slack.openssf.org)\r\n - [Metric Dashboard](https://metrics.openssf.org)\r\n- TODO\r\n - @pdxjohnny\r\n - [ ] Reach out to Christine about metrics collaboration\r\n - [ ] Respond with slides for Mike if he asks"
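A toy sketch of the predeclared context-reuse idea from the TODO above: a hypothetical `Operation`-like record allowlists which parent contexts (`ictx`, `nctx`, ...) it may reuse, and an LTM-auditor style check denies anything else. These names are illustrative, not DFFML's actual data structures:

```python
import dataclasses


@dataclasses.dataclass
class Operation:
    name: str
    # Contexts this operation predeclares it may reuse from the top level
    # system context; anything not listed must be denied at runtime.
    reuse: frozenset = frozenset()


def audit_context_reuse(op: Operation, requested: set) -> None:
    """LTM-auditor style check: deny any context reuse not predeclared."""
    denied = requested - set(op.reuse)
    if denied:
        raise PermissionError(f"{op.name} may not reuse contexts: {sorted(denied)}")


# Allowed: "ictx" is predeclared. Requesting "nctx" as well would raise.
audit_context_reuse(Operation("meta_issue_body", frozenset({"ictx"})), {"ictx"})
```

The same declaration could double as static analysis input, per the ELF vs Python IR comparison above, since the allowlist describes a trust boundary property rather than runtime behavior alone.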
}
]
},
{
"body": "# 2022-07-21 Engineering Logs\r\n\r\n- https://docs.rs/differential-dataflow/latest/differential_dataflow/\r\n- https://lists.spdx.org/g/Spdx-tech/message/4673\r\n - > It is not just a matter of your software, it is a fundamental design question whether to maintain separation between the logical model and its serializations. Maintaining separation shouldn't be a matter of personal preference, it's good software engineering. The OWL Web Ontology Language https://www.w3.org/TR/owl2-overview/ has an excellent diagram illustrating the separation between semantics and syntax. Several serializations are defined in OWL (Manchester Syntax, Functional Syntax, RDF/XML, OWL/XML, and Turtle), and more syntaxes have been added since (JSON-LD, RDF-star, ...).",
"replies": []
},
{
"body": "# 2022-07-23\r\n\r\n- https://blog.ciaranmcnulty.com/2022-05-12-multiple-build-contexts",
"replies": []
},
{
"body": "# 2022-07-28 Alice Intelligence/Open Architecture Working Group Initial Meeting\r\n\r\n- Meeting info\r\n - 8-9 AM Pacific\r\n - https://meet.google.com/kox-ssqn-kjd",
"replies": []
},
{
"body": "# 2022-07-25 Engineering Logs\r\n\r\n- TODO\r\n - [x] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] @dffml: Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.",
"replies": [
{
"body": "## 2022-07-25 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- References\r\n - https://spdx.github.io/canonical-serialisation/"
},
{
"body": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n- Links\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247\r\n - https://github.com/transmute-industries/openssl-did-web-tutorial\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- Typical global warming centric chit chat\r\n- Others at RATs meeting or busy with other IETF activities\r\n- Introductions\r\n - Kelvin Cusack\r\n - Filling in for John from his org\r\n - John Andersen\r\n - Connecting dots between this and OpenSSF\r\n - Yogesh was excited to see someone from Intel\r\n - Intel is involved in RATs but not as much here\r\n - Kiran Karunakaran\r\n - Microsoft\r\n - On Kay Williams's team\r\n - Will likely lead this meeting in the future\r\n- Upcoming Birds of Feather (BoF)\r\n - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp\r\n - There is registration for remote\r\n - Problem Statement currently scoped around software\r\n - We went back to a more scoped problem statement is that we want to form a formal working group in the IETF for SCITT.\r\n - In order to form it we have to have stuffiest people show interest in the problem space\r\n - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress\r\n - The early design proposal for the SCITT transparency service is that the service is content agnostic\r\n - any kind of metadata could be provided and retrieved\r\n - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products\r\n - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. 
govt.)\r\n - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software\r\n - Leadership thought scope was too big at first\r\n- Thoughts around scope\r\n - Charter is focused on software, an attainable goal\r\n - Once we have a WG we can later broaden the scope via re-chartering\r\n- We will design something that works for hardware and for software\r\n - We are hanging software window curtains but we are looking at everything\r\n- Software systems interact with everything else\r\n - Dick Brooks (REA), any manifest could be signed and processed with SCITT\r\n - It's just metadata, what was issued, how it was issued\r\n - @dffml: We are encoding system contexts, Alice, into the chain, one place where she will live.\r\n- Opens\r\n - Open Policy Agent (mentioned in meeting minutes doc future topics)\r\n - What are your plans / thoughts around Open Policy Agent and encoding policies into SCITT?\r\n - Policy can be used in two places\r\n - Policy for what can be put onto the register\r\n - Some registries might constrain themselves for what types of data they allow\r\n - Policy for someone evaluating the contents of the registry to make decisions for fitness of use\r\n - REGO also considered as a policy language\r\n - Perhaps decide on multi\r\n - This policy discussion will happen in this WG for now, then maybe a sub working group\r\n - Dick Brooks: Mentions HR4081\r\n - On topic for what we are \r\n - Talked about attestations for devices that contain a camera and microphone and are connected to the internet\r\n - There will need to be an attestation from the device \r\n - Dick submitted to local rep to include software attestations as well\r\n - https://www.congress.gov/bill/117th-congress/house-bill/4081\r\n - https://www.congress.gov/bill/117th-congress/house-bill/4081/text\r\n- Producers and other parties provide content into the system, attestations, claims, recorded into SCITT ledger\r\n- Dick contacted his congress person to ask to add an amendment to HR4081\r\n - Amendment for smartphone apps to provide a trust score\r\n - Tie in with OpenSSF metrics database to grab the security of repos involved\r\n - Dick\r\n - Proposed amendment I mentioned for HR 4081:\r\n - \"Require smart phone app stores to include a software supply chain trust score for each app\". This gives consumers the ability to check trustworthiness before installing an app,\r\n- Ray: thinking about attestations is different than the transparency ledger\r\n - Thinks it's a lot to bite off to do both\r\n - Are there IoT folks that might have more attestation experience we could tap into?\r\n - Is there a sub-working group focused on device attestations (in response to HR4081)\r\n - Device attestations could be recorded in the transparency ledger\r\n - TCG DICE WG is a target point of engagement (UCAN, CBOR, DID).\r\n - https://trustedcomputinggroup.org/work-groups/dice-architectures/\r\n - https://www.trustedcomputinggroup.org/wp-content/uploads/Device-Identifier-Composition-Engine-Rev69_Public-Review.pdf\r\n - Looking at hardware actively attesting\r\n - Microsoft has an Open Source implementation on GitHub\r\n - This attestation stuff starts to look at real life commerce, Ray thinks it's important to \r\n- Joshua Lock\r\n - On software attestations, I have been working on a page for the SLSA website to describe the model we're working with. 
I can share to the SCITT list once the change is merged.\r\n - IIRC the SCITT draft standards refer to a \"software attestation\" as a \"claim\", to disambiguate from RATS & TCG attestations\r\n- Remote Attestation and Device Attestation\r\n - Embraced COSE and CBOR\r\n - Also in SCITT\r\n - Hopefully we converge on underlying formats for both in-toto style and remote attestation style attestations\r\n- There are also NIST attestations\r\n- Vuln information mentioned by Kay as possible content inserted into SCITT\r\n - This is a goal of ours with our CVE Binary Tool engagement\r\n - We also could encode SBOMs from the systems that built them, we could patch sigstore to insert into a SCITT ledger"
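A toy sketch of the content-agnostic transparency service discussed in these minutes: claims are arbitrary metadata, and the registry only guarantees an append-only, hash-chained record with a receipt-like handle. This illustrates the shape of the idea, not the SCITT draft's actual COSE/CBOR envelope or receipt format:

```python
import hashlib
import json


class ToyTransparencyLedger:
    """Append-only, hash-chained registry of arbitrary claims."""

    def __init__(self) -> None:
        self.entries = []

    def register(self, claim: dict) -> str:
        # Chain each entry to the previous one so history cannot be
        # silently rewritten without changing every later entry id.
        prev = self.entries[-1]["entry_id"] if self.entries else "genesis"
        payload = json.dumps(claim, sort_keys=True)
        entry_id = hashlib.sha256((prev + payload).encode()).hexdigest()
        self.entries.append({"entry_id": entry_id, "claim": claim})
        return entry_id  # receipt-like handle callers can cite later


ledger = ToyTransparencyLedger()
receipt = ledger.register({"type": "sbom", "subject": "pkg:pypi/dffml", "digest": "sha256:..."})
print(receipt)
```

The content agnosticism is the point raised in the meeting: SBOMs, test results, or ballots all register the same way, and fitness-for-use policy is applied by whoever reads the ledger, not by the ledger itself.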
}
]
},
{
"body": "# 2022-07-26 Engineering Logs\r\n\r\n- TODO\r\n - [x] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] @dffml: Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.",
"replies": [
{
"body": "## 2022-07-26 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n![Software Analysis Trinity drawio](https://user-images.githubusercontent.com/5950433/181014158-4187950e-d0a4-4d7d-973b-dc414320e64f.svg)\r\n\r\n- Update current overlays to have lock taken on `AliceGitRepo` and then subflows with `ReadmeGitRepo` and `ContributingGitRepo`.\r\n - This way the parent flow locks and they don't have to worry about loosing the lock between operations.\r\n\r\n```console\r\n$ git grep -C 22 run_custom\r\nalice/please/contribute/recommended_community_standards/cli.py- async def cli_run_on_repo(self, repo: \"CLIRunOnRepo\"):\r\nalice/please/contribute/recommended_community_standards/cli.py- # TODO Similar to Expand being an alias of Union\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # async def cli_run_on_repo(self, repo: 'CLIRunOnRepo') -> SystemContext[StringInputSetContext[AliceGitRepo]]:\r\nalice/please/contribute/recommended_community_standards/cli.py- # return repo\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # Or ideally at class scope\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # 'CLIRunOnRepo' -> SystemContext[StringInputSetContext[AliceGitRepo]]\r\nalice/please/contribute/recommended_community_standards/cli.py- async with self.parent.__class__(self.parent.config) as custom_run_dataflow:\r\nalice/please/contribute/recommended_community_standards/cli.py- async with custom_run_dataflow(\r\nalice/please/contribute/recommended_community_standards/cli.py- self.ctx, self.octx\r\nalice/please/contribute/recommended_community_standards/cli.py- ) as custom_run_dataflow_ctx:\r\nalice/please/contribute/recommended_community_standards/cli.py- # This is the type 
cast\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow.op = self.parent.op._replace(\r\nalice/please/contribute/recommended_community_standards/cli.py- inputs={\r\nalice/please/contribute/recommended_community_standards/cli.py- \"repo\": AlicePleaseContributeRecommendedCommunityStandards.RepoString\r\nalice/please/contribute/recommended_community_standards/cli.py- }\r\nalice/please/contribute/recommended_community_standards/cli.py- )\r\nalice/please/contribute/recommended_community_standards/cli.py- # Set the dataflow to be the same flow\r\nalice/please/contribute/recommended_community_standards/cli.py- # TODO Reuse ictx? Is that applicable?\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow.config.dataflow = self.octx.config.dataflow\r\nalice/please/contribute/recommended_community_standards/cli.py: await dffml.run_dataflow.run_custom(\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow_ctx, {\"repo\": repo},\r\nalice/please/contribute/recommended_community_standards/cli.py- )\r\n```"
}
]
},
{
"body": "# 2022-07-27 Engineering Logs\r\n\r\n- References\r\n - kaniko coder k3d digitalocean\r\n - The following were issues with kind which might also effect us\r\n - https://github.com/GoogleContainerTools/kaniko/issues/2164\r\n - https://github.com/tektoncd/pipeline/commit/6542823c8330581fcfe6ba5a8ea7682a06510bcb\r\n - It doesn't look like kaniko currently supports multi context builds\r\n - Great example of communication and meeting procedures link to code\r\n - https://lists.spdx.org/g/Spdx-tech/message/4699",
"replies": [
{
"body": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n### Refactoring and Thinking About Locking of Repos for Contributions\r\n\r\n- Metadata\r\n - Date: 2022-07-27 20:30 UTC -7\r\n- Saving this diff which was some work on dynamic application of overlay\r\n so as to support fixup of the OpImp for `meta_issue_body()`'s inputs.\r\n - We are going to table this for now for time reasons, but if someone\r\n wants to pick it up before @pdxjohnny is back in September, please\r\n give it a go (create an issue).\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n- `alice`\r\n - Goal: Display Alice and software analysis trinity\r\n - https://free-images.com/search/?q=alice%27s+adventures+in+wonderland&cat=st\r\n - https://free-images.com/display/de_alices_abenteuer_im_43.html\r\n - https://github.com/KhorSL/ASCII-ART\r\n - Completed in d067273f8571b6a56733336663aaebc3acb3a701\r\n\r\n![alice looking up](https://user-images.githubusercontent.com/5950433/181431145-18cfc8a7-28c8-486f-80f9-8b250e0b0943.png)\r\n\r\n```console\r\n$ python ascii_art.py /mnt/c/Users/Johnny/Downloads/alice-looking-up-white-background.png\r\n```\r\n\r\n```console\r\n$ alice\r\nusage: alice [-h] [-log LOG] {please,shouldi,threats} ...\r\n\r\n .,*&&888@@#&:,\r\n .:&::,...,:&#@@@#:.\r\n .o,. 
..:8@@#@@+\r\n .8o+,+o*+*+,+:&#@@#8@@.\r\n &8&###@#&..*:8#@@#@#@@&+.\r\n ,@:#@##@@8,:&#@@@###@88@@.\r\n ,#@8&#@@@#o:#@@@@#8#@#8+&#.\r\n +8####@@@@###@@@888#@@@#oo#.\r\n .*8@###@@@@@@@@@#o*#@@#@@#8o@,\r\n +###@#o8&#@@##8::##@@@&&#@8#&+\r\n o@8&#&##::.,o&+88#&8##8*@@#@#,\r\n .##888&&oo#&o8###8&o##8##&####8,\r\n .&#@8&:+o+&@@@#8#&8:8@@@@@#8@@@oo+\r\n ,&&#@##oo+*:@###&#88,@@@@#@o&##&8#@o,.\r\n ,#&###@@8:*,#o&@@@@##:&#@###*.&o++o#@@#&+\r\n o8&8o8@#8+,,#.88#@#&@&&#@##++*&#o&&&#@@@@.\r\n *88:,#8&#,o+:+@&8#:8@8&8#@@&o++,*++*+:#@@*.\r\n .+#:o###@8o&8*@o&o8@o888@@@o+:o*&&,@#:&@@@,\r\n *+&@8&#@o#8+8*#+8#+88@@@@@@&@###8##@8:*,\r\n +o.@##@@@&88@*8@:8@@@@@@:.. ,8@:++.\r\n +&++8@@@@##@@@@@@@@@@@+ 88\r\n &. *@8@:+##o&888#@@@, .#+\r\n &. ,@+o,.::+*+*:&#&, ,@.\r\n &. .@8*,. ,*+++.+* :8+\r\n :+ .#@::. .8:.:** .8@@o,\r\n .o. #@+ :@,.&* .:@@@@@@8**.\r\n +&. :@o,+.*o,*, .*@@@@@@@@@@#o\r\n .*:&o. 8@o:,*:, .o@@#8&&@@@@#@@@*\r\n ,*:+:::o.*&8+,++ ,&@@#: * :@@88@@@#:.\r\n ,::**:o:.,&*+*8: *8@@##o *,.8@@#8#@#@#+\r\n *:+*&o8:. ,o,o:8@+o@@88:*@+ +: +#@#####8##&.\r\n ,:&::88&, .&:#o#@@@#,+&&*#&. .:,.&#@#88#####&,\r\n +::o+&8:. :##88@@@@:.:8o+&8&. .. +8###&8&##&88*\r\n .:*+*.8#: ,o*.+&@@#@8,,o8*+8##+ .+#8##8&&#8888:.\r\n ,:o., &#8. .:8*. .o, &#,*:8:+,&*:, .8@@#o&&##8:&#8.\r\n .*o.*,+o8#* +8&, .::. .88.+:8o: ,+:, ,o#@#8&o8##&#8+\r\n +o, .+,,o#8+,8@o**.,o*, :8o +*8#* +&, ,*o@@#@&8&oo8&:,\r\n oo*+,,,*8@#..&@8:**:oo+. +8#* *+#@:...oo+ .**:8@@@ooo&:&o##+\r\n ::+..,++#@,.:##o&o**,....oo#++#8#@:.,:8&:.....*&@@#:oo*&oo&#@*\r\n .+**:*8@o,+##&o:+,,,+,,o*8#,,8@#@:,,+*o*++,,,,+&#@8*8o88&::*. .,,,,,++,\r\n ..8@++#@#88:,,,.,,,:+#&,,#@@#:,,.,&o*,.+++*:#@8+:*+. ......,:+*&,,.....\r\n +:&8#@@##8&+,,,***@&,.8@@@*,,,.:o8&o&*o&o&o. .,.****::*:o*:o*o+,.\r\n ...,*:*o&&o*8@@&o8@@@8+,,+:&&:+,... ,++*&oo&8&&&oo#@##8#&8:.\r\n o@#@@@@#@@@@@@@,..... ..,,.+*::o#@##@##@#@#########@@@8:,.\r\n ,@##@@88#@@@@@8 .:***oo*#8###8#@#@#@#@####@#@###@@#8&#:\r\n 8+.,8+..,*o#@+ ,o+o88&88###@8#######@8#8#88#8#88##88#&\r\n *o *+ #8 . ,*o&#@##@@@@@@@@@######8#888&&oo:8:\r\n 8, ,& +@* .ooo&#@@@@@#@@@@@@####@##8#8##oo:o&:,\r\n +& &, .@#. .:8#@@@@@@@@@@##8#####8#o&&#8*:8&&8:\r\n o* ,o o@& +o#@@@@@@@@#o&o88:&+ooo&:*::o:o&**o.:*+\r\n .8. 8.,o#8 .+&#@@@@@@@@&o+,::*+*:+:, ,. ,.. .,. ,.\r\n 8. 
8.,.&@:*:&@@@@@@@@8o+, ,.\r\n :@o:#,,o8&:o&@@@@#&:+.\r\n .@@@@@@@@@@@#8&o+,\r\n ,*:&#@#&o*,..\r\n\r\n /\\\r\n / \\\r\n Intent\r\n / \\\r\n / \\\r\n / \\\r\n / \\\r\n / \\\r\n / Alice is Here \\\r\n / \\\r\n / \\\r\n /______________________\\\r\n\r\n Dynamic Analysis Static Analysis\r\n\r\n Alice's source code: https://github.com/intel/dffml/tree/alice/entities/alice\r\n How we built Alice: https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n How to extend Alice: https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n Comment to get involved: https://github.com/intel/dffml/discussions/1406\r\n\r\n\r\npositional arguments:\r\n {please,shouldi,threats}\r\n\r\noptional arguments:\r\n -h, --help show this help message and exit\r\n -log LOG Logging Level\r\n```\r\n\r\n- TODO\r\n - [ ] Auto fork repo before push\r\n - [ ] Update origin to push to\r\n - [ ] Create PR\r\n - [ ] Update README to fix demos\r\n - [ ] Update CONTRIBUTING with tutorial on adding\r\n `CONTRIBUTING.md` check and contribution\r\n\r\n**entities/alice/alice/timelines.py**\r\n\r\n```python\r\n\"\"\"\r\nHelpers for the timelines we support\r\n\"\"\"\r\n\r\n# Trinity Day 0\r\nALICE_DAY_0_GREGORIAN = datetime.datetime(2022, 4, 16)\r\n\r\ndef date_alice_from_gregorian(date: str) -> int:\r\n # TODO\r\n return ALICE_DAY_0_GREGORIAN\r\n```\r\n\r\n```diff\r\ndiff --git a/dffml/base.py b/dffml/base.py\r\nindex fea0ef7220..9d6cd886fa 100644\r\n--- a/dffml/base.py\r\n+++ b/dffml/base.py\r\n@@ -237,6 +237,7 @@ def convert_value(arg, value, *, dataclass=None):\r\n # before checking if the value is an instance of that\r\n # type. Since it doesn't make sense to check if the\r\n # value is an instance of something that's not a type.\r\n+ print(possible_type, value)\r\n if isinstance(possible_type, type) and isinstance(\r\n value, possible_type\r\n ):\r\ndiff --git a/dffml/df/system_context/system_context.py b/dffml/df/system_context/system_context.py\r\nindex e055a343f1..063547ad0c 100644\r\n--- a/dffml/df/system_context/system_context.py\r\n+++ b/dffml/df/system_context/system_context.py\r\n@@ -90,11 +90,11 @@ class SystemContextConfig:\r\n # links: 'SystemContextConfig'\r\n overlay: Union[\"SystemContextConfig\", DataFlow] = field(\r\n \"The overlay we will apply with any overlays to merge within it (see default overlay usage docs)\",\r\n- default=APPLY_INSTALLED_OVERLAYS,\r\n+ default=None,\r\n )\r\n orchestrator: Union[\"SystemContextConfig\", BaseOrchestrator] = field(\r\n \"The system context who's default flow will be used to produce an orchestrator which will be used to execute this system context including application of overlays\",\r\n- default_factory=lambda: MemoryOrchestrator,\r\n+ default=None,\r\n )\r\n \r\n \r\n@@ -131,6 +131,7 @@ class SystemContext(BaseDataFlowFacilitatorObject):\r\n )\r\n # TODO(alice) Apply overlay\r\n if self.config.overlay not in (None, APPLY_INSTALLED_OVERLAYS):\r\n+ print(self.config.overlay)\r\n breakpoint()\r\n raise NotImplementedError(\r\n \"Application of overlays within SystemContext class entry not yet supported\"\r\ndiff --git a/dffml/high_level/dataflow.py b/dffml/high_level/dataflow.py\r\nindex d180b5c302..d595ae1cb4 100644\r\n--- a/dffml/high_level/dataflow.py\r\n+++ b/dffml/high_level/dataflow.py\r\n@@ -206,12 +206,25 @@ async def run(\r\n # the of the one that got passed in and the overlay.\r\n if inspect.isclass(overlay):\r\n overlay = overlay()\r\n+ # TODO Move this into Overlay.load. 
Create a system context to\r\n+ # execute the overlay if it is not already.\r\n+ known_overlay_types = (DataFlow, SystemContext)\r\n+ if not isinstance(overlay, known_overlay_types):\r\n+ raise NotImplementedError(f\"{overlay} is not a known type {known_overlay_types}\")\r\n+ if isinstance(overlay, DataFlow):\r\n+ overlay = SystemContext(\r\n+ upstream=overlay,\r\n+ )\r\n # TODO(alice) overlay.deployment(\"native.python.overlay.apply\")\r\n apply_overlay = overlay.deployment()\r\n async for _ctx, result in apply_overlay(\r\n dataflow=dataflow,\r\n ):\r\n+ print(\"FEEDFACE\", _ctx, result)\r\n+ breakpoint()\r\n+ return\r\n continue\r\n+\r\n # TODO\r\n resultant_system_context = SystemContext(\r\n upstream=result[\"overlays_merged\"], overlay=None,\r\ndiff --git a/dffml/overlay/overlay.py b/dffml/overlay/overlay.py\r\nindex 13a50d9c10..0a01d38de9 100644\r\n--- a/dffml/overlay/overlay.py\r\n+++ b/dffml/overlay/overlay.py\r\n@@ -124,7 +124,7 @@ DFFML_MAIN_PACKAGE_OVERLAY = DataFlow(\r\n stage=Stage.OUTPUT,\r\n inputs={\r\n \"merged\": DataFlowAfterOverlaysMerged,\r\n- \"dataflow_we_are_applying_overlays_to_by_running_overlay_dataflow_and_passing_as_an_input\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n+ \"upstream\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n },\r\n outputs={\"overlayed\": DataFlowAfterOverlaysApplied,},\r\n multi_output=False,\r\n@@ -208,15 +208,12 @@ merge_implementations(\r\n DFFML_OVERLAYS_INSTALLED.update(auto_flow=True)\r\n \r\n # Create Class for calling operations within the System Context as methods\r\n-DFFMLOverlaysInstalled = SystemContext.subclass(\r\n- \"DFFMLOverlaysInstalled\",\r\n- {\r\n- \"upstream\": {\"default_factory\": lambda: DFFML_OVERLAYS_INSTALLED},\r\n- # TODO(alice) We'll need to make sure we have code to instantiate and\r\n- # instance of a class if only a class is given an not an instance.\r\n- \"overlay\": {\"default_factory\": lambda: None},\r\n- \"orchestrator\": {\"default_factory\": lambda: MemoryOrchestrator()},\r\n- },\r\n+DFFMLOverlaysInstalled = SystemContext(\r\n+ upstream=DFFML_OVERLAYS_INSTALLED,\r\n+ # TODO(alice) We'll need to make sure we have code to instantiate and\r\n+ # instance of a class if only a class is given an not an instance.\r\n+ overlay=None,\r\n+ orchestrator=MemoryOrchestrator(),\r\n )\r\n \r\n # Callee\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\nindex 46d20c8c85..fff5d4928b 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n@@ -18,6 +18,14 @@ from ....recommended_community_standards import AliceGitRepo, AlicePleaseContrib\r\n from ....dffml.operations.git.contribute import AlicePleaseContributeRecommendedCommunityStandardsOverlayGit\r\n \r\n \r\n+GitHubIssue = NewType(\"GitHubIssue\", str)\r\n+\r\n+\r\n+@dataclasses.dataclass\r\n+class RecommendedCommunityStandardContribution:\r\n+ path: pathlib.Path\r\n+ issue: GitHubIssue\r\n+\r\n \r\n class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n@@ -39,6 +47,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n MetaIssueTitle = NewType(\"MetaIssueTitle\", str)\r\n 
MetaIssueBody = NewType(\"MetaIssueBody\", str)\r\n \r\n+ # TODO This should only be run if there is a need for a README\r\n # body: Optional['ContributingIssueBody'] = \"References:\\n- https://docs.github.com/articles/setting-guidelines-for-repository-contributors/\",\r\n async def readme_issue(\r\n self,\r\n@@ -79,13 +88,40 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n ).lstrip()\r\n \r\n- # TODO(alice) There is a bug with Optional which can be revield by use here\r\n+\r\n @staticmethod\r\n+ async def readme_contribution(\r\n+ issue: \"ReadmeIssue\",\r\n+ path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n+ ) -> RecommendedCommunityStandardContribution:\r\n+ return RecommendedCommunityStandardContribution(\r\n+ path=path,\r\n+ issue=issue,\r\n+ )\r\n+\r\n+\r\n+ \"\"\"\r\n+ @dffml.op(\r\n+ stage=dffml.Stage.OUTPUT,\r\n+ )\r\n+ async def collect_recommended_community_standard_contributions(\r\n+ self,\r\n+ ) -> List[RecommendedCommunityStandardContribution]:\r\n+ async with self.octx.ictx.definitions(self.ctx) as od:\r\n+ return [item async for item in od.inputs(RecommendedCommunityStandardContribution)]\r\n+ \"\"\"\r\n+\r\n+\r\n+ # TODO(alice) There is a bug with Optional which can be revield by use here\r\n def meta_issue_body(\r\n repo: AliceGitRepo,\r\n base: AlicePleaseContributeRecommendedCommunityStandardsOverlayGit.BaseBranch,\r\n- readme_path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n- readme_issue: ReadmeIssue,\r\n+ # recommended_community_standard_contributions: List[RecommendedCommunityStandardContribution],\r\n+ # TODO On @op inspect paramter if Collect is found on an input, wrap the\r\n+ # operation in a subflow and add a generic version of\r\n+ # collect_recommended_community_standard_contributions to the flow as an\r\n+ # autostart or triggered via auto start operation.\r\n+ # recommended_community_standard_contributions: Collect[List[RecommendedCommunityStandardContribution]],\r\n ) -> \"MetaIssueBody\":\r\n \"\"\"\r\n >>> AlicePleaseContributeRecommendedCommunityStandardsGitHubIssueOverlay.meta_issue_body(\r\n@@ -98,6 +134,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n - [] [License](https://github.com/intel/dffml/blob/main/LICENSE)\r\n - [] Security\r\n \"\"\"\r\n+ readme_issue, readme_path = recommended_community_standard_contributions[0]\r\n return \"\\n\".join(\r\n [\r\n \"- [\"\r\n```"
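- Note on the **entities/alice/alice/timelines.py** stub above: it is not yet runnable, since it never imports `datetime`, and the `# TODO` body returns a `datetime` where the signature promises an `int`. A minimal completion sketch, assuming (an assumption of this note, not settled API) that a date on the Alice timeline is the count of whole days since Trinity Day 0:\r\n
\r\n
```python\r\n
\"\"\"\r\n
Helpers for the timelines we support\r\n
\"\"\"\r\n
import datetime\r\n
\r\n
# Trinity Day 0\r\n
ALICE_DAY_0_GREGORIAN = datetime.datetime(2022, 4, 16)\r\n
\r\n
\r\n
def date_alice_from_gregorian(date: str) -> int:\r\n
    # Assumes ISO 8601 input, e.g. '2022-07-28'; whole days elapsed since Day 0\r\n
    return (datetime.datetime.fromisoformat(date) - ALICE_DAY_0_GREGORIAN).days\r\n
```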
}
]
},
{
"body": "# 2022-07-28 Engineering Logs",
"replies": [
{
"body": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n### 43\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n 
writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect the inputs were discarded because of a mismatched origin; if not that, will check the definition\r\n - Found out that it was a seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayReadme so that its definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - We found it created an infinite loop\r\n - Will try reusing the redundancy checker, which seems to be doing well\r\n- https://github.com/intel/dffml/issues/1408\r\n- Now debugging why `readme_pr` is not called. OverlayGit definitions were seen earlier\r\n on subflow start to be present, so it must be something else.\r\n - The logs tell us that alice_contribute_readme is returning `None`, which means\r\n that the downstream operation is not called, since None means no return value\r\n in this case.\r\n\r\n```\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n- Future\r\n - `run_custom`: Optionally support forwarding inputs to subflows\r\n- TODO\r\n - [ ] Set definition property `AliceGitRepo.lock` to `True`\r\n\r\n\r\n### 44\r\n\r\n- Found out that util: subprocess: run command events: Do not return after yield of stdout/err\r\n - Fixed in b6eea6ed4549f9e7a89aab6306a51213b2bf36c9\r\n\r\n```console\r\n$ (for i in $(echo determin_base_branch readme_pr_body contribute_readme_md github_owns_remote alice_contribute_readme); do grep -rn \"${i} Outputs\" .output/2022-07-28-14-11.txt; done) | sort | uniq | sort\r\n354:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote Outputs: {'result': 
'origin'}\r\n361:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Outputs: {'result': 'master'}\r\n450:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body Outputs: {'result': 'Closes: https://github.com/pdxjohnny/testaaaa/issues/188'}\r\n472:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}\r\n479:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n```\r\n(Pdb)\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}\r\n(Pdb) pprint(readme_dataflow.flow['alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr'].inputs)\r\n{'base': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch': 'result'}],\r\n 'body': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body': 'result'}],\r\n 'head': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md': 'result'}],\r\n 'origin': ['seed'],\r\n 'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}],\r\n 'title': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_title': 'result'}]}\r\n```\r\n\r\n- origin is set to seed\r\n - `'origin': ['seed']` was there because `OverlayGitHub.github_owns_remote` is not in the flow\r\n - We forgot add it to `entry_points.txt`, added\r\n\r\n```console\r\n$ dffml service dev export alice.cli:AlicePleaseContributeCLIDataFlow | tee alice.please.contribute.recommended_community_standards.json\r\n$ (echo -e 'HTTP/1.0 200 OK\\n' && dffml dataflow diagram -shortname alice.please.contribute.recommended_community_standards.json) | nc -Nlp 9999;\r\n```\r\n\r\n- Opens\r\n - `guessed_repo_string_means_no_git_branch_given` is feeding `git_repo_default_branch` but `dffml dataflow diagram` just have a bug because it's not showing the connection.\r\n\r\n```mermaid\r\ngraph TD\r\nsubgraph a759a07029077edc5c37fea0326fa281[Processing Stage]\r\nstyle a759a07029077edc5c37fea0326fa281 fill:#afd388b5,stroke:#a4ca7a\r\nsubgraph 8cfb8cd5b8620de4a7ebe0dfec00771a[cli_has_repos]\r\nstyle 8cfb8cd5b8620de4a7ebe0dfec00771a fill:#fff4de,stroke:#cece71\r\nd493c90433d19f11f33c2d72cd144940[cli_has_repos]\r\ne07552ee3b6b7696cb3ddd786222eaad(cmd)\r\ne07552ee3b6b7696cb3ddd786222eaad --> d493c90433d19f11f33c2d72cd144940\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324(wanted)\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324 --> d493c90433d19f11f33c2d72cd144940\r\n79e1ea6822bff603a835fb8ee80c7ff3(result)\r\nd493c90433d19f11f33c2d72cd144940 --> 
79e1ea6822bff603a835fb8ee80c7ff3\r\nend\r\nsubgraph 0c2b64320fb5666a034794bb2195ecf0[cli_is_asking_for_recommended_community_standards]\r\nstyle 0c2b64320fb5666a034794bb2195ecf0 fill:#fff4de,stroke:#cece71\r\n222ee6c0209f1f1b7a782bc1276868c7[cli_is_asking_for_recommended_community_standards]\r\n330f463830aa97e88917d5a9d1c21500(cmd)\r\n330f463830aa97e88917d5a9d1c21500 --> 222ee6c0209f1f1b7a782bc1276868c7\r\nba29b52e9c5aa88ea1caeeff29bfd491(result)\r\n222ee6c0209f1f1b7a782bc1276868c7 --> ba29b52e9c5aa88ea1caeeff29bfd491\r\nend\r\nsubgraph eac58e8db2b55cb9cc5474aaa402c93e[cli_is_meant_on_this_repo]\r\nstyle eac58e8db2b55cb9cc5474aaa402c93e fill:#fff4de,stroke:#cece71\r\n6c819ad0228b0e7094b33e0634da9a38[cli_is_meant_on_this_repo]\r\ndc7c5f0836f7d2564c402bf956722672(cmd)\r\ndc7c5f0836f7d2564c402bf956722672 --> 6c819ad0228b0e7094b33e0634da9a38\r\n58d8518cb0d6ef6ad35dc242486f1beb(wanted)\r\n58d8518cb0d6ef6ad35dc242486f1beb --> 6c819ad0228b0e7094b33e0634da9a38\r\n135ee61e3402d6fcbd7a219b0b4ccd73(result)\r\n6c819ad0228b0e7094b33e0634da9a38 --> 135ee61e3402d6fcbd7a219b0b4ccd73\r\nend\r\nsubgraph 37887bf260c5c8e9bd18038401008bbc[cli_run_on_repo]\r\nstyle 37887bf260c5c8e9bd18038401008bbc fill:#fff4de,stroke:#cece71\r\n9d1042f33352800e54d98c9c5a4223df[cli_run_on_repo]\r\ne824ae072860bc545fc7d55aa0bca479(repo)\r\ne824ae072860bc545fc7d55aa0bca479 --> 9d1042f33352800e54d98c9c5a4223df\r\n40109d487bb9f08608d8c5f6e747042f(result)\r\n9d1042f33352800e54d98c9c5a4223df --> 40109d487bb9f08608d8c5f6e747042f\r\nend\r\nsubgraph 66ecd0c1f2e08941c443ec9cd89ec589[guess_repo_string_is_directory]\r\nstyle 66ecd0c1f2e08941c443ec9cd89ec589 fill:#fff4de,stroke:#cece71\r\n737d719a0c348ff65456024ddbc530fe[guess_repo_string_is_directory]\r\n33d806f9b732bfd6b96ae2e9e4243a68(repo_string)\r\n33d806f9b732bfd6b96ae2e9e4243a68 --> 737d719a0c348ff65456024ddbc530fe\r\ndd5aab190ce844673819298c5b8fde76(result)\r\n737d719a0c348ff65456024ddbc530fe --> dd5aab190ce844673819298c5b8fde76\r\nend\r\nsubgraph 4ea6696419c4a0862a4f63ea1f60c751[create_branch_if_none_exists]\r\nstyle 4ea6696419c4a0862a4f63ea1f60c751 fill:#fff4de,stroke:#cece71\r\n502369b37882b300d6620d5b4020f5b2[create_branch_if_none_exists]\r\nfdcb9b6113856222e30e093f7c38065e(name)\r\nfdcb9b6113856222e30e093f7c38065e --> 502369b37882b300d6620d5b4020f5b2\r\nbdcf4b078985f4a390e4ed4beacffa65(repo)\r\nbdcf4b078985f4a390e4ed4beacffa65 --> 502369b37882b300d6620d5b4020f5b2\r\n5a5493ab86ab4053f1d44302e7bdddd6(result)\r\n502369b37882b300d6620d5b4020f5b2 --> 5a5493ab86ab4053f1d44302e7bdddd6\r\nend\r\nsubgraph b1d510183f6a4c3fde207a4656c72cb4[determin_base_branch]\r\nstyle b1d510183f6a4c3fde207a4656c72cb4 fill:#fff4de,stroke:#cece71\r\n476aecd4d4d712cda1879feba46ea109[determin_base_branch]\r\nff47cf65b58262acec28507f4427de45(default_branch)\r\nff47cf65b58262acec28507f4427de45 --> 476aecd4d4d712cda1879feba46ea109\r\n150204cd2d5a921deb53c312418379a1(result)\r\n476aecd4d4d712cda1879feba46ea109 --> 150204cd2d5a921deb53c312418379a1\r\nend\r\nsubgraph 2a08ff341f159c170b7fe017eaad2f18[git_repo_to_alice_git_repo]\r\nstyle 2a08ff341f159c170b7fe017eaad2f18 fill:#fff4de,stroke:#cece71\r\n7f74112f6d30c6289caa0a000e87edab[git_repo_to_alice_git_repo]\r\ne58180baf478fe910359358a3fa02234(repo)\r\ne58180baf478fe910359358a3fa02234 --> 7f74112f6d30c6289caa0a000e87edab\r\n9b92d5a346885079a2821c4d27cb5174(result)\r\n7f74112f6d30c6289caa0a000e87edab --> 9b92d5a346885079a2821c4d27cb5174\r\nend\r\nsubgraph b5d35aa8a8dcd28d22d47caad02676b0[guess_repo_string_is_url]\r\nstyle b5d35aa8a8dcd28d22d47caad02676b0 
fill:#fff4de,stroke:#cece71\r\n0de074e71a32e30889b8bb400cf8db9f[guess_repo_string_is_url]\r\nc3bfe79b396a98ce2d9bfe772c9c20af(repo_string)\r\nc3bfe79b396a98ce2d9bfe772c9c20af --> 0de074e71a32e30889b8bb400cf8db9f\r\n2a1c620b0d510c3d8ed35deda41851c5(result)\r\n0de074e71a32e30889b8bb400cf8db9f --> 2a1c620b0d510c3d8ed35deda41851c5\r\nend\r\nsubgraph 60791520c6d124c0bf15e599132b0caf[guessed_repo_string_is_operations_git_url]\r\nstyle 60791520c6d124c0bf15e599132b0caf fill:#fff4de,stroke:#cece71\r\n102f173505d7b546236cdeff191369d4[guessed_repo_string_is_operations_git_url]\r\n4934c6211334318c63a5e91530171c9b(repo_url)\r\n4934c6211334318c63a5e91530171c9b --> 102f173505d7b546236cdeff191369d4\r\n8d0adc31da1a0919724baf73d047743c(result)\r\n102f173505d7b546236cdeff191369d4 --> 8d0adc31da1a0919724baf73d047743c\r\nend\r\nsubgraph f2c7b93622447999daab403713239ada[guessed_repo_string_means_no_git_branch_given]\r\nstyle f2c7b93622447999daab403713239ada fill:#fff4de,stroke:#cece71\r\nc8294a87e7aae8f7f9cb7f53e054fed5[guessed_repo_string_means_no_git_branch_given]\r\n5567dd8a6d7ae4fe86252db32e189a4d(repo_url)\r\n5567dd8a6d7ae4fe86252db32e189a4d --> c8294a87e7aae8f7f9cb7f53e054fed5\r\nd888e6b64b5e3496056088f14dab9894(result)\r\nc8294a87e7aae8f7f9cb7f53e054fed5 --> d888e6b64b5e3496056088f14dab9894\r\nend\r\nsubgraph 113addf4beee5305fdc79d2363608f9d[github_owns_remote]\r\nstyle 113addf4beee5305fdc79d2363608f9d fill:#fff4de,stroke:#cece71\r\n049b72b81b976fbb43607bfeeb0464c5[github_owns_remote]\r\n6c2b36393ffff6be0b4ad333df2d9419(remote)\r\n6c2b36393ffff6be0b4ad333df2d9419 --> 049b72b81b976fbb43607bfeeb0464c5\r\n19a9ee483c1743e6ecf0a2dc3b6f8c7a(repo)\r\n19a9ee483c1743e6ecf0a2dc3b6f8c7a --> 049b72b81b976fbb43607bfeeb0464c5\r\nb4cff8d194413f436d94f9d84ece0262(result)\r\n049b72b81b976fbb43607bfeeb0464c5 --> b4cff8d194413f436d94f9d84ece0262\r\nend\r\nsubgraph 43a22312a3d4f5c995c54c5196acc50a[create_meta_issue]\r\nstyle 43a22312a3d4f5c995c54c5196acc50a fill:#fff4de,stroke:#cece71\r\nd2345f23e5ef9f54c591c4a687c24575[create_meta_issue]\r\n1d79010ee1550f057c531130814c40b9(body)\r\n1d79010ee1550f057c531130814c40b9 --> d2345f23e5ef9f54c591c4a687c24575\r\n712d4318e59bd2dc629f0ddebb257ca3(repo)\r\n712d4318e59bd2dc629f0ddebb257ca3 --> d2345f23e5ef9f54c591c4a687c24575\r\n38a94f1c2162803f571489d707d61021(title)\r\n38a94f1c2162803f571489d707d61021 --> d2345f23e5ef9f54c591c4a687c24575\r\n2b22b4998ac3e6a64d82e0147e71ee1b(result)\r\nd2345f23e5ef9f54c591c4a687c24575 --> 2b22b4998ac3e6a64d82e0147e71ee1b\r\nend\r\nsubgraph f77af509c413b86b6cd7e107cc623c73[meta_issue_body]\r\nstyle f77af509c413b86b6cd7e107cc623c73 fill:#fff4de,stroke:#cece71\r\n69a9852570720a3d35cb9dd52a281f71[meta_issue_body]\r\n480d1cc478d23858e92d61225349b674(base)\r\n480d1cc478d23858e92d61225349b674 --> 69a9852570720a3d35cb9dd52a281f71\r\n37035ea5a06a282bdc1e1de24090a36d(readme_issue)\r\n37035ea5a06a282bdc1e1de24090a36d --> 69a9852570720a3d35cb9dd52a281f71\r\nfdf0dbb8ca47ee9022b3daeb8c7df9c0(readme_path)\r\nfdf0dbb8ca47ee9022b3daeb8c7df9c0 --> 69a9852570720a3d35cb9dd52a281f71\r\n428ca84f627c695362652cc7531fc27b(repo)\r\n428ca84f627c695362652cc7531fc27b --> 69a9852570720a3d35cb9dd52a281f71\r\n0cd9eb1ffb3c56d2b0a4359f800b1f20(result)\r\n69a9852570720a3d35cb9dd52a281f71 --> 0cd9eb1ffb3c56d2b0a4359f800b1f20\r\nend\r\nsubgraph 8506cba6514466fb6d65f33ace4b0eac[alice_contribute_readme]\r\nstyle 8506cba6514466fb6d65f33ace4b0eac 
fill:#fff4de,stroke:#cece71\r\nd4507d3d1c3fbf3e7e373eae24797667[alice_contribute_readme]\r\n68cf7d6869d027ca46a5fb4dbf7001d1(repo)\r\n68cf7d6869d027ca46a5fb4dbf7001d1 --> d4507d3d1c3fbf3e7e373eae24797667\r\n2f9316539862f119f7c525bf9061e974(result)\r\nd4507d3d1c3fbf3e7e373eae24797667 --> 2f9316539862f119f7c525bf9061e974\r\nend\r\nsubgraph 4233e6dc67cba131d4ef005af9c02959[contribute_readme_md]\r\nstyle 4233e6dc67cba131d4ef005af9c02959 fill:#fff4de,stroke:#cece71\r\n3db0ee5d6ab83886bded5afd86f3f88f[contribute_readme_md]\r\n37044e4d8610abe13849bc71a5cb7591(base)\r\n37044e4d8610abe13849bc71a5cb7591 --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n631c051fe6050ae8f8fc3321ed00802d(commit_message)\r\n631c051fe6050ae8f8fc3321ed00802d --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n182194bab776fc9bc406ed573d621b68(repo)\r\n182194bab776fc9bc406ed573d621b68 --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n0ee9f524d2db12be854fe611fa8126dd(result)\r\n3db0ee5d6ab83886bded5afd86f3f88f --> 0ee9f524d2db12be854fe611fa8126dd\r\nend\r\nsubgraph a6080d9c45eb5f806a47152a18bf7830[create_readme_file_if_not_exists]\r\nstyle a6080d9c45eb5f806a47152a18bf7830 fill:#fff4de,stroke:#cece71\r\n67e388f508dd96084c37d236a2c67e67[create_readme_file_if_not_exists]\r\n54faf20bfdca0e63d07efb3e5a984cf1(readme_contents)\r\n54faf20bfdca0e63d07efb3e5a984cf1 --> 67e388f508dd96084c37d236a2c67e67\r\n8c089c362960ccf181742334a3dccaea(repo)\r\n8c089c362960ccf181742334a3dccaea --> 67e388f508dd96084c37d236a2c67e67\r\n5cc65e17d40e6a7223c1504f1c4b0d2a(result)\r\n67e388f508dd96084c37d236a2c67e67 --> 5cc65e17d40e6a7223c1504f1c4b0d2a\r\nend\r\nsubgraph e7757158127e9845b2915c16a7fa80c5[readme_commit_message]\r\nstyle e7757158127e9845b2915c16a7fa80c5 fill:#fff4de,stroke:#cece71\r\n562bdc535c7cebfc66dba920b1a17540[readme_commit_message]\r\n0af5cbea9050874a0a3cba73bb61f892(issue_url)\r\n0af5cbea9050874a0a3cba73bb61f892 --> 562bdc535c7cebfc66dba920b1a17540\r\n2641f3b67327fb7518ee34a3a40b0755(result)\r\n562bdc535c7cebfc66dba920b1a17540 --> 2641f3b67327fb7518ee34a3a40b0755\r\nend\r\nsubgraph cf99ff6fad80e9c21266b43fd67b2f7b[readme_issue]\r\nstyle cf99ff6fad80e9c21266b43fd67b2f7b fill:#fff4de,stroke:#cece71\r\nda44417f891a945085590baafffc2bdb[readme_issue]\r\nd519830ab4e07ec391038e8581889ac3(body)\r\nd519830ab4e07ec391038e8581889ac3 --> da44417f891a945085590baafffc2bdb\r\n268852aa3fa8ab0864a32abae5a333f7(repo)\r\n268852aa3fa8ab0864a32abae5a333f7 --> da44417f891a945085590baafffc2bdb\r\n77a11dd29af309cf43ed321446c4bf01(title)\r\n77a11dd29af309cf43ed321446c4bf01 --> da44417f891a945085590baafffc2bdb\r\n1d2360c9da18fac0b6ec142df8f3fbda(result)\r\nda44417f891a945085590baafffc2bdb --> 1d2360c9da18fac0b6ec142df8f3fbda\r\nend\r\nsubgraph 7ec0442cf2d95c367912e8abee09b217[readme_pr]\r\nstyle 7ec0442cf2d95c367912e8abee09b217 fill:#fff4de,stroke:#cece71\r\nbb314dc452cde5b6af5ea94dd277ba40[readme_pr]\r\n127d77c3047facc1daa621148c5a0a1d(base)\r\n127d77c3047facc1daa621148c5a0a1d --> bb314dc452cde5b6af5ea94dd277ba40\r\ncb421e4de153cbb912f7fbe57e4ad734(body)\r\ncb421e4de153cbb912f7fbe57e4ad734 --> bb314dc452cde5b6af5ea94dd277ba40\r\ncbf7a0b88c0a41953b245303f3e9a0d3(head)\r\ncbf7a0b88c0a41953b245303f3e9a0d3 --> bb314dc452cde5b6af5ea94dd277ba40\r\ne5f9ad44448abd2469b3fd9831f3d159(origin)\r\ne5f9ad44448abd2469b3fd9831f3d159 --> bb314dc452cde5b6af5ea94dd277ba40\r\na35aee6711d240378eb57a3932537ca1(repo)\r\na35aee6711d240378eb57a3932537ca1 --> bb314dc452cde5b6af5ea94dd277ba40\r\ndfcce88a7d605d46bf17de1159fbe5ad(title)\r\ndfcce88a7d605d46bf17de1159fbe5ad --> 
bb314dc452cde5b6af5ea94dd277ba40\r\na210a7890a7bea8d629368e02da3d806(result)\r\nbb314dc452cde5b6af5ea94dd277ba40 --> a210a7890a7bea8d629368e02da3d806\r\nend\r\nsubgraph 227eabb1f1c5cc0bc931714a03049e27[readme_pr_body]\r\nstyle 227eabb1f1c5cc0bc931714a03049e27 fill:#fff4de,stroke:#cece71\r\n2aea976396cfe68dacd9bc7d4a3f0cba[readme_pr_body]\r\nc5dfd309617c909b852afe0b4ae4a178(readme_issue)\r\nc5dfd309617c909b852afe0b4ae4a178 --> 2aea976396cfe68dacd9bc7d4a3f0cba\r\n40ddb5b508cb5643e7c91f7abdb72b84(result)\r\n2aea976396cfe68dacd9bc7d4a3f0cba --> 40ddb5b508cb5643e7c91f7abdb72b84\r\nend\r\nsubgraph 48687c84e69b3db0acca625cbe2e6b49[readme_pr_title]\r\nstyle 48687c84e69b3db0acca625cbe2e6b49 fill:#fff4de,stroke:#cece71\r\nd8668ff93f41bc241c8c540199cd7453[readme_pr_title]\r\n3b2137dd1c61d0dac7d4e40fd6746cfb(readme_issue)\r\n3b2137dd1c61d0dac7d4e40fd6746cfb --> d8668ff93f41bc241c8c540199cd7453\r\n956e024fde513b3a449eac9ee42d6ab3(result)\r\nd8668ff93f41bc241c8c540199cd7453 --> 956e024fde513b3a449eac9ee42d6ab3\r\nend\r\nsubgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL]\r\nstyle d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71\r\nf577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL]\r\n7440e73a8e8f864097f42162b74f2762(URL)\r\n7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40\r\n8e39b501b41c5d0e4596318f80a03210(valid)\r\nf577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210\r\nend\r\nsubgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo]\r\nstyle af8da22d1318d911f29b95e687f87c5d fill:#fff4de,stroke:#cece71\r\n155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo]\r\need77b9eea541e0c378c67395351099c(URL)\r\need77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n8b5928cd265dd2c44d67d076f60c8b05(ssh_key)\r\n8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n4e1d5ea96e050e46ebf95ebc0713d54c(repo)\r\n155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c\r\n6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL}\r\n6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\nend\r\nsubgraph d3d91578caf34c0ae944b17853783406[git_repo_default_branch]\r\nstyle d3d91578caf34c0ae944b17853783406 fill:#fff4de,stroke:#cece71\r\n546062a96122df465d2631f31df4e9e3[git_repo_default_branch]\r\n181f1b33df4d795fbad2911ec7087e86(repo)\r\n181f1b33df4d795fbad2911ec7087e86 --> 546062a96122df465d2631f31df4e9e3\r\n57651c1bcd24b794dfc8d1794ab556d5(branch)\r\n546062a96122df465d2631f31df4e9e3 --> 57651c1bcd24b794dfc8d1794ab556d5\r\n5ed1ab77e726d7efdcc41e9e2f8039c6(remote)\r\n546062a96122df465d2631f31df4e9e3 --> 5ed1ab77e726d7efdcc41e9e2f8039c6\r\n4c3cdd5f15b7a846d291aac089e8a622{no_git_branch_given}\r\n4c3cdd5f15b7a846d291aac089e8a622 --> 546062a96122df465d2631f31df4e9e3\r\nend\r\nend\r\nsubgraph a4827add25f5c7d5895c5728b74e2beb[Cleanup Stage]\r\nstyle a4827add25f5c7d5895c5728b74e2beb fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph 58ca4d24d2767176f196436c2890b926[Output Stage]\r\nstyle 58ca4d24d2767176f196436c2890b926 fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph inputs[Inputs]\r\nstyle inputs fill:#f6dbf9,stroke:#a178ca\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> e07552ee3b6b7696cb3ddd786222eaad\r\nba29b52e9c5aa88ea1caeeff29bfd491 --> cee6b5fdd0b6fbd0539cdcdc7f5a3324\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> 
330f463830aa97e88917d5a9d1c21500\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> dc7c5f0836f7d2564c402bf956722672\r\nba29b52e9c5aa88ea1caeeff29bfd491 --> 58d8518cb0d6ef6ad35dc242486f1beb\r\n79e1ea6822bff603a835fb8ee80c7ff3 --> e824ae072860bc545fc7d55aa0bca479\r\n135ee61e3402d6fcbd7a219b0b4ccd73 --> e824ae072860bc545fc7d55aa0bca479\r\n40109d487bb9f08608d8c5f6e747042f --> 33d806f9b732bfd6b96ae2e9e4243a68\r\n21ccfd2c550bd853d28581f0b0c9f9fe(seed<br>default.branch.name)\r\n21ccfd2c550bd853d28581f0b0c9f9fe --> fdcb9b6113856222e30e093f7c38065e\r\ndd5aab190ce844673819298c5b8fde76 --> bdcf4b078985f4a390e4ed4beacffa65\r\n9b92d5a346885079a2821c4d27cb5174 --> bdcf4b078985f4a390e4ed4beacffa65\r\n5a5493ab86ab4053f1d44302e7bdddd6 --> ff47cf65b58262acec28507f4427de45\r\n57651c1bcd24b794dfc8d1794ab556d5 --> ff47cf65b58262acec28507f4427de45\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> e58180baf478fe910359358a3fa02234\r\n40109d487bb9f08608d8c5f6e747042f --> c3bfe79b396a98ce2d9bfe772c9c20af\r\n2a1c620b0d510c3d8ed35deda41851c5 --> 4934c6211334318c63a5e91530171c9b\r\n2a1c620b0d510c3d8ed35deda41851c5 --> 5567dd8a6d7ae4fe86252db32e189a4d\r\n5ed1ab77e726d7efdcc41e9e2f8039c6 --> 6c2b36393ffff6be0b4ad333df2d9419\r\ndd5aab190ce844673819298c5b8fde76 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a\r\n9b92d5a346885079a2821c4d27cb5174 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a\r\n0cd9eb1ffb3c56d2b0a4359f800b1f20 --> 1d79010ee1550f057c531130814c40b9\r\ndd5aab190ce844673819298c5b8fde76 --> 712d4318e59bd2dc629f0ddebb257ca3\r\n9b92d5a346885079a2821c4d27cb5174 --> 712d4318e59bd2dc629f0ddebb257ca3\r\ne7ad3469d98c3bd160363dbc47e2d741(seed<br>MetaIssueTitle)\r\ne7ad3469d98c3bd160363dbc47e2d741 --> 38a94f1c2162803f571489d707d61021\r\n150204cd2d5a921deb53c312418379a1 --> 480d1cc478d23858e92d61225349b674\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 37035ea5a06a282bdc1e1de24090a36d\r\n5cc65e17d40e6a7223c1504f1c4b0d2a --> fdf0dbb8ca47ee9022b3daeb8c7df9c0\r\ndd5aab190ce844673819298c5b8fde76 --> 428ca84f627c695362652cc7531fc27b\r\n9b92d5a346885079a2821c4d27cb5174 --> 428ca84f627c695362652cc7531fc27b\r\ndd5aab190ce844673819298c5b8fde76 --> 68cf7d6869d027ca46a5fb4dbf7001d1\r\n9b92d5a346885079a2821c4d27cb5174 --> 68cf7d6869d027ca46a5fb4dbf7001d1\r\n150204cd2d5a921deb53c312418379a1 --> 37044e4d8610abe13849bc71a5cb7591\r\n2641f3b67327fb7518ee34a3a40b0755 --> 631c051fe6050ae8f8fc3321ed00802d\r\n2f9316539862f119f7c525bf9061e974 --> 182194bab776fc9bc406ed573d621b68\r\nd2708225c1f4c95d613a2645a17a5bc0(seed<br>repo.directory.readme.contents)\r\nd2708225c1f4c95d613a2645a17a5bc0 --> 54faf20bfdca0e63d07efb3e5a984cf1\r\n2f9316539862f119f7c525bf9061e974 --> 8c089c362960ccf181742334a3dccaea\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 0af5cbea9050874a0a3cba73bb61f892\r\n1daacccd02f8117e67ad3cb8686a732c(seed<br>ReadmeIssueBody)\r\n1daacccd02f8117e67ad3cb8686a732c --> d519830ab4e07ec391038e8581889ac3\r\n2f9316539862f119f7c525bf9061e974 --> 268852aa3fa8ab0864a32abae5a333f7\r\n0c1ab2d4bda10e1083557833ae5c5da4(seed<br>ReadmeIssueTitle)\r\n0c1ab2d4bda10e1083557833ae5c5da4 --> 77a11dd29af309cf43ed321446c4bf01\r\n150204cd2d5a921deb53c312418379a1 --> 127d77c3047facc1daa621148c5a0a1d\r\n40ddb5b508cb5643e7c91f7abdb72b84 --> cb421e4de153cbb912f7fbe57e4ad734\r\n0ee9f524d2db12be854fe611fa8126dd --> cbf7a0b88c0a41953b245303f3e9a0d3\r\nb4cff8d194413f436d94f9d84ece0262 --> e5f9ad44448abd2469b3fd9831f3d159\r\n2f9316539862f119f7c525bf9061e974 --> a35aee6711d240378eb57a3932537ca1\r\n956e024fde513b3a449eac9ee42d6ab3 --> 
dfcce88a7d605d46bf17de1159fbe5ad\r\n1d2360c9da18fac0b6ec142df8f3fbda --> c5dfd309617c909b852afe0b4ae4a178\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 3b2137dd1c61d0dac7d4e40fd6746cfb\r\n8d0adc31da1a0919724baf73d047743c --> 7440e73a8e8f864097f42162b74f2762\r\n8d0adc31da1a0919724baf73d047743c --> eed77b9eea541e0c378c67395351099c\r\na6ed501edbf561fda49a0a0a3ca310f0(seed<br>git_repo_ssh_key)\r\na6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05\r\n8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 181f1b33df4d795fbad2911ec7087e86\r\nend\r\n```\r\n\r\n- As of f8619a6362251d04929f4bfa395882b3257a3776 it works without meta issue\r\n creation: https://github.com/pdxjohnny/testaaaa/pull/193\r\n\r\n### 45\r\n\r\n```console\r\n$ gif-for-cli --rows $(tput lines) --cols $(tput cols) --export=/mnt/c/Users/Johnny/Downloads/alice-search-alices-adventures-in-wonderland-1.gif \"Alice's Adventures in Wonderland\"\r\n```\r\n\r\n```console\r\n$ watch -n 0.2 'grep FEEDFACE .output/$(ls .output/ | tail -n 1) | sed -e \"s/alice.please.contribute.recommended_community_standards.recommended_community_standards.//g\" | grep -i repo'\r\n```"
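- The seed vs. output origin mismatch from sections 43 and 44 above can also be spotted without a debugger by printing the `InputFlow`s that `auto_flow` generates. A minimal sketch, assuming standalone `@dffml.op` functions work here (these are hypothetical stand-ins named after the real operations, not the overlay classes themselves):\r\n
\r\n
```python\r\n
import dffml\r\n
from typing import NewType\r\n
\r\n
BaseBranch = NewType('BaseBranch', str)\r\n
ReadmeBranch = NewType('ReadmeBranch', str)\r\n
\r\n
\r\n
@dffml.op\r\n
async def determin_base_branch(default_branch: str) -> BaseBranch:\r\n
    return default_branch\r\n
\r\n
\r\n
@dffml.op\r\n
async def contribute_readme_md(base: BaseBranch) -> ReadmeBranch:\r\n
    return 'alice-contribute-recommended-community-standards-readme'\r\n
\r\n
\r\n
# auto_flow (on by default) wires contribute_readme_md's base input to\r\n
# determin_base_branch's output origin because their definitions match.\r\n
dataflow = dffml.DataFlow(determin_base_branch, contribute_readme_md)\r\n
for instance_name, input_flow in dataflow.flow.items():\r\n
    # Any input whose origins are still only ['seed'] has no producer in\r\n
    # the flow, which is how the unregistered github_owns_remote showed up.\r\n
    print(instance_name, input_flow.inputs)\r\n
```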
}
]
},
{
"body": "# 2022-07-29 Engineering Logs\r\n\r\n- Alice PR: https://github.com/intel/dffml/pull/1401\r\n- John's last day before sabbatical\r\n - He will be in town but offline until 2022-08-29\r\n- Rolling Alice: 2022 Progress Reports: July Activities Recap: https://youtu.be/JDh2DARl8os\r\n- Alice is ready for contribution\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n - Self fulfilling prophecy again! We can even automate our contributions to her even if we wanted to! She will eventually! :P\r\n- IETF\r\n - Joined SCITT WG, will rejoin in September, others please do as well!\r\n- OpenSSF\r\n - Aligned with Identifying Security Threats WG on SCITT looking like a solid direction to cross with Stream 8 for 1-2 year timeframe as Web5 space matures.\r\n- Graphics to help people get involved\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f",
"replies": [
{
"body": "## 2022-07-29 @pdxjohnny Engineering Logs\r\n\r\n- AppSec PNW 2022 Talk playlist: https://youtube.com/playlist?list=PLfoJYLR9vr_IAd1vYWdKCOO4YYpGFVv99\r\n - John^2: Living Threat Models are Better Than Dead Threat Models\r\n - Not yet uploaded but has Alice's first live demo\r\n- https://towardsdatascience.com/installing-multiple-alternative-versions-of-python-on-ubuntu-20-04-237be5177474\r\n - `$ sudo update-alternatives --install /usr/bin/python python /usr/bin/python3 40`\r\n- References\r\n - https://tenor.com/search/alice-gifs\r\n - https://tenor.com/view/why-thank-you-thanks-bow-thank-you-alice-in-wonderland-gif-3553903\r\n - Alice curtsy\r\n - https://tenor.com/view/alice-in-wonderland-gif-26127117\r\n - Alice blows out unbirthday cake candle\r\n\r\n```console\r\n$ alice; sleep 3; gif-for-cli -l 0 --rows $(tput lines) --cols $(tput cols) 3553903\r\n```\r\n\r\n```console\r\n$ gif-for-cli --rows `tput lines` --cols `tput cols` --export=alice-search-alices-adventures-in-wonderland-1.gif \"Alice curtsy\"\r\n(why-thank-you-thanks-bow-thank-you-alice-in-wonderland-gif-3553903)\r\n$ gif-for-cli --rows `tput lines` --cols `tput cols` --export=ascii-gif-alice-unbirthday-blow-out-candles-0.gif 26127117\r\n$ gif-for-cli --rows `tput lines` --cols `tput cols` ascii-gif-alice-unbirthday-blow-out-candles-0.gif\r\n$ echo gif-for-cli --rows `tput lines` --cols `tput cols`\r\ngif-for-cli --rows 97 --cols 320\r\n$ gif-for-cli -l 0 --rows `tput lines` --cols `tput cols` /mnt/c/Users/Johnny/Downloads/ascii-alices-adventures-in-wonderland-1.gif`\r\n```\r\n\r\n### Exploring a Helper Around Run DataFlow run_custom\r\n\r\n- Realized we already have the lock because it's on `git_repository` at `flow_depth=1`\r\n\r\n```diff\r\ndiff --git a/dffml/df/base.py b/dffml/df/base.py\r\nindex 4f84c1c7c8..2da0512602 100644\r\n--- a/dffml/df/base.py\r\n+++ b/dffml/df/base.py\r\n@@ -404,14 +404,19 @@ def op(\r\n )\r\n \r\n definition_name = \".\".join(name_list)\r\n+ print(\"FEEDFACE\", name, definition_name)\r\n if hasattr(param_annotation, \"__supertype__\") and hasattr(\r\n param_annotation, \"__name__\"\r\n ):\r\n+ if \"repo\" in definition_name:\r\n+ breakpoint()\r\n definition_name = param_annotation.__name__\r\n+ print(\"FEEDFACE\", name, definition_name)\r\n if inspect.isclass(param_annotation) and hasattr(\r\n param_annotation, \"__qualname__\"\r\n ):\r\n definition_name = param_annotation.__qualname__\r\n+ print(\"FEEDFACE\", name, definition_name)\r\n \r\n if isinstance(param_annotation, Definition):\r\n kwargs[\"inputs\"][name] = param_annotation\r\ndiff --git a/dffml/df/types.py b/dffml/df/types.py\r\nindex f09a8a3cea..54840f58c0 100644\r\n--- a/dffml/df/types.py\r\n+++ b/dffml/df/types.py\r\n@@ -44,6 +44,7 @@ APPLY_INSTALLED_OVERLAYS = _APPLY_INSTALLED_OVERLAYS()\r\n \r\n \r\n Expand = Union\r\n+LockReadWrite = Union\r\n \r\n \r\n primitive_types = (int, float, str, bool, dict, list, bytes)\r\n@@ -65,7 +66,7 @@ def find_primitive(new_type: Type) -> Type:\r\n )\r\n \r\n \r\n-def new_type_to_defininition(new_type: Type) -> Type:\r\n+def new_type_to_defininition(new_type: Type, lock: bool = False) -> Type:\r\n \"\"\"\r\n >>> from typing import NewType\r\n >>> from dffml import new_type_to_defininition\r\n@@ -77,6 +78,7 @@ def new_type_to_defininition(new_type: Type) -> Type:\r\n return Definition(\r\n name=new_type.__name__,\r\n primitive=find_primitive(new_type).__qualname__,\r\n+ lock=lock,\r\n links=(\r\n create_definition(\r\n find_primitive(new_type).__qualname__, new_type.__supertype__\r\n@@ 
-95,7 +97,28 @@ class CouldNotDeterminePrimitive(Exception):\r\n \"\"\"\r\n \r\n \r\n-def resolve_if_forward_ref(param_annotation, forward_refs_from_cls):\r\n+DEFAULT_DEFINTION_ANNOTATIONS_HANDLERS = {\r\n+ LockReadWrite: lambda definition: setattr(definition, \"lock\", True),\r\n+}\r\n+\r\n+\r\n+def resolve_if_forward_ref(\r\n+ param_annotation,\r\n+ forward_refs_from_cls,\r\n+ *,\r\n+ defintion_annotations_handlers=None,\r\n+) -> Tuple[Union[\"Definition\", Any], bool]:\r\n+ \"\"\"\r\n+ Return values:\r\n+\r\n+ param_or_definition: Union[Definition, Any]\r\n+ lock: bool\r\n+\r\n+ If the definition should be locked or not.\r\n+ \"\"\"\r\n+ if defintion_annotations_handlers is None:\r\n+ defintion_annotations_handlers = DEFAULT_DEFINTION_ANNOTATIONS_HANDLERS\r\n+ annotations = {}\r\n if isinstance(param_annotation, ForwardRef):\r\n param_annotation = param_annotation.__forward_arg__\r\n if (\r\n@@ -104,11 +127,22 @@ def resolve_if_forward_ref(param_annotation, forward_refs_from_cls):\r\n and hasattr(forward_refs_from_cls, param_annotation)\r\n ):\r\n param_annotation = getattr(forward_refs_from_cls, param_annotation)\r\n+ # Check if are in an annotation\r\n+ param_annotation_origin = get_origin(param_annotation)\r\n+ if param_annotation_origin in defintion_annotations_handlers:\r\n+ annotations[\r\n+ param_annotation_origin\r\n+ ] = defintion_annotations_handlers[param_annotation_origin]\r\n+ param_annotation = list(get_args(param_annotation))[0]\r\n+ # Create definition\r\n if hasattr(param_annotation, \"__name__\") and hasattr(\r\n param_annotation, \"__supertype__\"\r\n ):\r\n # typing.NewType support\r\n- return new_type_to_defininition(param_annotation)\r\n+ definition = new_type_to_defininition(param_annotation)\r\n+ for handler in annotations.values():\r\n+ handler(definition)\r\n+ return definition\r\n return param_annotation\r\n \r\n \r\n@@ -118,6 +152,7 @@ def _create_definition(\r\n default=NO_DEFAULT,\r\n *,\r\n forward_refs_from_cls: Optional[object] = None,\r\n+ lock: bool = False,\r\n ):\r\n param_annotation = resolve_if_forward_ref(\r\n param_annotation, forward_refs_from_cls\r\n@@ -138,12 +173,14 @@ def _create_definition(\r\n elif get_origin(param_annotation) in [\r\n Union,\r\n collections.abc.AsyncIterator,\r\n+ LockReadWrite,\r\n ]:\r\n # If the annotation is of the form Optional\r\n return create_definition(\r\n name,\r\n list(get_args(param_annotation))[0],\r\n forward_refs_from_cls=forward_refs_from_cls,\r\n+ lock=bool(get_origin(param_annotation) in (LockReadWrite,),),\r\n )\r\n elif (\r\n get_origin(param_annotation) is list\r\n@@ -235,6 +272,7 @@ def create_definition(\r\n default=NO_DEFAULT,\r\n *,\r\n forward_refs_from_cls: Optional[object] = None,\r\n+ lock: bool = False,\r\n ):\r\n if hasattr(param_annotation, \"__name__\") and hasattr(\r\n param_annotation, \"__supertype__\"\r\n@@ -246,6 +284,7 @@ def create_definition(\r\n param_annotation,\r\n default=default,\r\n forward_refs_from_cls=forward_refs_from_cls,\r\n+ lock=lock,\r\n )\r\n # We can guess name if converting from NewType. 
However, we can't otherwise.\r\n if not definition.name:\r\n@@ -847,7 +886,9 @@ class DataFlow:\r\n for operation in args:\r\n name = getattr(getattr(operation, \"op\", operation), \"name\")\r\n if name in operations:\r\n- raise ValueError(f\"Operation {name} given as positional and in dict\")\r\n+ raise ValueError(\r\n+ f\"Operation {name} given as positional and in dict\"\r\n+ )\r\n operations[name] = operation\r\n \r\n self.operations = operations\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\nindex 825f949d65..0ff7e11c31 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\n@@ -8,18 +8,21 @@ import dffml\r\n import dffml_feature_git.feature.definitions\r\n \r\n \r\n-class AliceGitRepo(NamedTuple):\r\n+class AliceGitRepoSpec(NamedTuple):\r\n directory: str\r\n URL: str\r\n \r\n \r\n+AliceGitRepo = dffml.LockReadWrite[AliceGitRepoSpec]\r\n+\r\n+\r\n class AliceGitRepoInputSetContextHandle(dffml.BaseContextHandle):\r\n def as_string(self) -> str:\r\n return str(self.ctx.repo)\r\n \r\n \r\n class AliceGitRepoInputSetContext(dffml.BaseInputSetContext):\r\n- def __init__(self, repo: AliceGitRepo):\r\n+ def __init__(self, repo: AliceGitRepoSpec):\r\n self.repo = repo\r\n \r\n async def handle(self) -> AliceGitRepoInputSetContextHandle:\r\n```\r\n\r\n- Is this the same as what we had in c89d3d8444cdad248fce5a7fff959c9ea48a7c9d ?\r\n\r\n```python\r\n async def alice_contribute_readme(self, repo: AliceGitRepo) -> ReadmeGitRepo:\r\n key, definition = list(self.parent.op.outputs.items())[0]\r\n await self.octx.ictx.cadd(\r\n AliceGitRepoInputSetContext(repo),\r\n dffml.Input(\r\n value=repo,\r\n definition=definition,\r\n parents=None,\r\n origin=(self.parent.op.instance_name, key),\r\n )\r\n )\r\n```\r\n\r\n```diff\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\nindex 825f949d65..1bc1c41e50 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\n@@ -203,30 +203,22 @@ class OverlayREADME:\r\n ReadmePRBody = NewType(\"github.pr.body\", str)\r\n\r\n # async def cli_run_on_repo(self, repo: \"CLIRunOnRepo\"):\r\n- async def alice_contribute_readme(self, repo: AliceGitRepo) -> ReadmeGitRepo:\r\n- # TODO Clean this up once SystemContext refactor complete\r\n- readme_dataflow_cls_upstream = OverlayREADME\r\n- readme_dataflow_cls_overlays = dffml.Overlay.load(\r\n- entrypoint=\"dffml.overlays.alice.please.contribute.recommended_community_standards.overlay.readme\"\r\n- )\r\n- readme_dataflow_upstream = dffml.DataFlow(\r\n- *dffml.object_to_operations(readme_dataflow_cls_upstream)\r\n- )\r\n+ async def new_context(self, repo: AliceGitRepo) -> ReadmeGitRepo:\r\n+ return\r\n # auto_flow with overlays\r\n- readme_dataflow = dffml.DataFlow(\r\n+ dataflow = dffml.DataFlow(\r\n *itertools.chain(\r\n *[\r\n dffml.object_to_operations(cls)\r\n for cls in [\r\n- 
readme_dataflow_cls_upstream,\r\n- *readme_dataflow_cls_overlays,\r\n+ upstream,\r\n+ *overlays,\r\n ]\r\n ]\r\n )\r\n )\r\n async with dffml.run_dataflow.imp(\r\n- # dataflow=self.octx.config.dataflow,\r\n- dataflow=readme_dataflow,\r\n+ dataflow=dataflow,\r\n input_set_context_cls=AliceGitRepoInputSetContext,\r\n ) as custom_run_dataflow:\r\n # Copy all inputs from parent context into child. We eventually\r\n@@ -277,6 +269,18 @@ class OverlayREADME:\r\n },\r\n )\r\n\r\n+ async def alice_contribute_readme(self, repo: AliceGitRepo) -> ReadmeGitRepo:\r\n+ key, definition = list(self.parent.op.outputs.items())[0]\r\n+ await self.octx.ictx.cadd(\r\n+ AliceGitRepoInputSetContext(repo),\r\n+ dffml.Input(\r\n+ value=repo,\r\n+ definition=definition,\r\n+ parents=None,\r\n+ origin=(self.parent.op.instance_name, key),\r\n+ )\r\n+ )\r\n+\r\n # TODO Run this system context where readme contexts is given on CLI or\r\n # overriden via disabling of static overlay and application of overlay to\r\n # generate contents dynamiclly.\r\n```\r\n\r\n- Visualize the flow before we attempt to add `CONTRIBUTING.md` contribution\r\n\r\n```console\r\n$ dffml service dev export alice.cli:AlicePleaseContributeCLIDataFlow | tee alice.please.contribute.recommended_community_standards.json\r\n$ (echo -e 'HTTP/1.0 200 OK\\n' && dffml dataflow diagram -shortname alice.please.contribute.recommended_community_standards.json) | nc -Nlp 9999;\r\n```\r\n\r\n```mermaid\r\ngraph TD\r\nsubgraph a759a07029077edc5c37fea0326fa281[Processing Stage]\r\nstyle a759a07029077edc5c37fea0326fa281 fill:#afd388b5,stroke:#a4ca7a\r\nsubgraph 8cfb8cd5b8620de4a7ebe0dfec00771a[cli_has_repos]\r\nstyle 8cfb8cd5b8620de4a7ebe0dfec00771a fill:#fff4de,stroke:#cece71\r\nd493c90433d19f11f33c2d72cd144940[cli_has_repos]\r\ne07552ee3b6b7696cb3ddd786222eaad(cmd)\r\ne07552ee3b6b7696cb3ddd786222eaad --> d493c90433d19f11f33c2d72cd144940\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324(wanted)\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324 --> d493c90433d19f11f33c2d72cd144940\r\n79e1ea6822bff603a835fb8ee80c7ff3(result)\r\nd493c90433d19f11f33c2d72cd144940 --> 79e1ea6822bff603a835fb8ee80c7ff3\r\nend\r\nsubgraph 0c2b64320fb5666a034794bb2195ecf0[cli_is_asking_for_recommended_community_standards]\r\nstyle 0c2b64320fb5666a034794bb2195ecf0 fill:#fff4de,stroke:#cece71\r\n222ee6c0209f1f1b7a782bc1276868c7[cli_is_asking_for_recommended_community_standards]\r\n330f463830aa97e88917d5a9d1c21500(cmd)\r\n330f463830aa97e88917d5a9d1c21500 --> 222ee6c0209f1f1b7a782bc1276868c7\r\nba29b52e9c5aa88ea1caeeff29bfd491(result)\r\n222ee6c0209f1f1b7a782bc1276868c7 --> ba29b52e9c5aa88ea1caeeff29bfd491\r\nend\r\nsubgraph eac58e8db2b55cb9cc5474aaa402c93e[cli_is_meant_on_this_repo]\r\nstyle eac58e8db2b55cb9cc5474aaa402c93e fill:#fff4de,stroke:#cece71\r\n6c819ad0228b0e7094b33e0634da9a38[cli_is_meant_on_this_repo]\r\ndc7c5f0836f7d2564c402bf956722672(cmd)\r\ndc7c5f0836f7d2564c402bf956722672 --> 6c819ad0228b0e7094b33e0634da9a38\r\n58d8518cb0d6ef6ad35dc242486f1beb(wanted)\r\n58d8518cb0d6ef6ad35dc242486f1beb --> 6c819ad0228b0e7094b33e0634da9a38\r\n135ee61e3402d6fcbd7a219b0b4ccd73(result)\r\n6c819ad0228b0e7094b33e0634da9a38 --> 135ee61e3402d6fcbd7a219b0b4ccd73\r\nend\r\nsubgraph 37887bf260c5c8e9bd18038401008bbc[cli_run_on_repo]\r\nstyle 37887bf260c5c8e9bd18038401008bbc fill:#fff4de,stroke:#cece71\r\n9d1042f33352800e54d98c9c5a4223df[cli_run_on_repo]\r\ne824ae072860bc545fc7d55aa0bca479(repo)\r\ne824ae072860bc545fc7d55aa0bca479 --> 
9d1042f33352800e54d98c9c5a4223df\r\n40109d487bb9f08608d8c5f6e747042f(result)\r\n9d1042f33352800e54d98c9c5a4223df --> 40109d487bb9f08608d8c5f6e747042f\r\nend\r\nsubgraph 66ecd0c1f2e08941c443ec9cd89ec589[guess_repo_string_is_directory]\r\nstyle 66ecd0c1f2e08941c443ec9cd89ec589 fill:#fff4de,stroke:#cece71\r\n737d719a0c348ff65456024ddbc530fe[guess_repo_string_is_directory]\r\n33d806f9b732bfd6b96ae2e9e4243a68(repo_string)\r\n33d806f9b732bfd6b96ae2e9e4243a68 --> 737d719a0c348ff65456024ddbc530fe\r\ndd5aab190ce844673819298c5b8fde76(result)\r\n737d719a0c348ff65456024ddbc530fe --> dd5aab190ce844673819298c5b8fde76\r\nend\r\nsubgraph 2bcd191634373f4b97ecb9546df23ee5[alice_contribute_contributing]\r\nstyle 2bcd191634373f4b97ecb9546df23ee5 fill:#fff4de,stroke:#cece71\r\na2541ce40b2e5453e8e919021011e5e4[alice_contribute_contributing]\r\n3786b4af914402320d260d077844620e(repo)\r\n3786b4af914402320d260d077844620e --> a2541ce40b2e5453e8e919021011e5e4\r\nda4270ecc44b6d9eed9809a560d24a28(result)\r\na2541ce40b2e5453e8e919021011e5e4 --> da4270ecc44b6d9eed9809a560d24a28\r\nend\r\nsubgraph 13b430e6b93de7e40957165687f8e593[contribute_contributing_md]\r\nstyle 13b430e6b93de7e40957165687f8e593 fill:#fff4de,stroke:#cece71\r\nff8f8968322872ccc3cf151d167e22a2[contribute_contributing_md]\r\n4f752ce18209f62ed749e88dd1f70266(base)\r\n4f752ce18209f62ed749e88dd1f70266 --> ff8f8968322872ccc3cf151d167e22a2\r\n2def8c6923c832adf33989b26c91295a(commit_message)\r\n2def8c6923c832adf33989b26c91295a --> ff8f8968322872ccc3cf151d167e22a2\r\nf5548fcbcec8745ddf04104fc78e83a3(repo)\r\nf5548fcbcec8745ddf04104fc78e83a3 --> ff8f8968322872ccc3cf151d167e22a2\r\n24292ae12efd27a227a0d6368ba01faa(result)\r\nff8f8968322872ccc3cf151d167e22a2 --> 24292ae12efd27a227a0d6368ba01faa\r\nend\r\nsubgraph 71a5f33f393735fa1cc91419b43db115[contributing_commit_message]\r\nstyle 71a5f33f393735fa1cc91419b43db115 fill:#fff4de,stroke:#cece71\r\nd034a42488583464e601bcaee619a539[contributing_commit_message]\r\nc0a0fa68a872adf890ed639e07ed5882(issue_url)\r\nc0a0fa68a872adf890ed639e07ed5882 --> d034a42488583464e601bcaee619a539\r\nce14ca2191f2b1c13c605b240e797255(result)\r\nd034a42488583464e601bcaee619a539 --> ce14ca2191f2b1c13c605b240e797255\r\nend\r\nsubgraph db8a1253cc59982323848f5e42c23c9d[contributing_issue]\r\nstyle db8a1253cc59982323848f5e42c23c9d fill:#fff4de,stroke:#cece71\r\nc39bd2cc88723432048c434fdd337eab[contributing_issue]\r\n821d21e8a69d1fa1757147e7e768f306(body)\r\n821d21e8a69d1fa1757147e7e768f306 --> c39bd2cc88723432048c434fdd337eab\r\n0581b90c76b0a4635a968682b060abff(repo)\r\n0581b90c76b0a4635a968682b060abff --> c39bd2cc88723432048c434fdd337eab\r\n809719538467f6d0bf18f7ae26f08d80(title)\r\n809719538467f6d0bf18f7ae26f08d80 --> c39bd2cc88723432048c434fdd337eab\r\nc9f2ea5a7f25b3ae9fbf5041be5fa071(result)\r\nc39bd2cc88723432048c434fdd337eab --> c9f2ea5a7f25b3ae9fbf5041be5fa071\r\nend\r\nsubgraph 1e6046d1a567bf390566b1b995df9dcf[contributing_pr]\r\nstyle 1e6046d1a567bf390566b1b995df9dcf fill:#fff4de,stroke:#cece71\r\n4ec1433342f2f12ab8c59efab20e7b06[contributing_pr]\r\nbb85c3467b05192c99a3954968c7a612(base)\r\nbb85c3467b05192c99a3954968c7a612 --> 4ec1433342f2f12ab8c59efab20e7b06\r\n77f6c1c6b7ee62881b49c289097dfbde(body)\r\n77f6c1c6b7ee62881b49c289097dfbde --> 4ec1433342f2f12ab8c59efab20e7b06\r\na0a2fabc65fe5601c7ea289124d04f70(head)\r\na0a2fabc65fe5601c7ea289124d04f70 --> 4ec1433342f2f12ab8c59efab20e7b06\r\ncf92708915b9f41cb490b991abd6c374(origin)\r\ncf92708915b9f41cb490b991abd6c374 --> 
4ec1433342f2f12ab8c59efab20e7b06\r\n210ae36c85f3597c248e0b32da7661ae(repo)\r\n210ae36c85f3597c248e0b32da7661ae --> 4ec1433342f2f12ab8c59efab20e7b06\r\n1700dc637c25bd503077a2a1422142e2(title)\r\n1700dc637c25bd503077a2a1422142e2 --> 4ec1433342f2f12ab8c59efab20e7b06\r\n806e8c455d2bb7ad68112d2a7e16eed6(result)\r\n4ec1433342f2f12ab8c59efab20e7b06 --> 806e8c455d2bb7ad68112d2a7e16eed6\r\nend\r\nsubgraph 04c27c13241164ae88456c1377995897[contributing_pr_body]\r\nstyle 04c27c13241164ae88456c1377995897 fill:#fff4de,stroke:#cece71\r\na3cebe78451142664930d44ad4d7d181[contributing_pr_body]\r\n6118470d0158ef1a220fe7c7232e1b63(contributing_issue)\r\n6118470d0158ef1a220fe7c7232e1b63 --> a3cebe78451142664930d44ad4d7d181\r\n99a7dd1ae037153eef80e1dee51b9d2b(result)\r\na3cebe78451142664930d44ad4d7d181 --> 99a7dd1ae037153eef80e1dee51b9d2b\r\nend\r\nsubgraph 0d4627f8d8564b6c4ba33c12dcb58fc1[contributing_pr_title]\r\nstyle 0d4627f8d8564b6c4ba33c12dcb58fc1 fill:#fff4de,stroke:#cece71\r\nbfa172a9399604546048d60db0a36187[contributing_pr_title]\r\n0fd26f9166ccca10c68e9aefa9c15767(contributing_issue)\r\n0fd26f9166ccca10c68e9aefa9c15767 --> bfa172a9399604546048d60db0a36187\r\n77a2f9d4dfad5f520f1502e8ba70e47a(result)\r\nbfa172a9399604546048d60db0a36187 --> 77a2f9d4dfad5f520f1502e8ba70e47a\r\nend\r\nsubgraph c67b92ef6a2e025ca086bc2f89d9afbb[create_contributing_file_if_not_exists]\r\nstyle c67b92ef6a2e025ca086bc2f89d9afbb fill:#fff4de,stroke:#cece71\r\n993a1fe069a02a45ba3579b1902b2a36[create_contributing_file_if_not_exists]\r\n401c179bb30b24c2ca989c64d0b1cdc7(contributing_contents)\r\n401c179bb30b24c2ca989c64d0b1cdc7 --> 993a1fe069a02a45ba3579b1902b2a36\r\ndde78f81b1bdfe02c0a2bf6e51f65cb4(repo)\r\ndde78f81b1bdfe02c0a2bf6e51f65cb4 --> 993a1fe069a02a45ba3579b1902b2a36\r\ne5b8d158dc0ec476dbbd44549a981815(result)\r\n993a1fe069a02a45ba3579b1902b2a36 --> e5b8d158dc0ec476dbbd44549a981815\r\nend\r\nsubgraph 4ea6696419c4a0862a4f63ea1f60c751[create_branch_if_none_exists]\r\nstyle 4ea6696419c4a0862a4f63ea1f60c751 fill:#fff4de,stroke:#cece71\r\n502369b37882b300d6620d5b4020f5b2[create_branch_if_none_exists]\r\nfdcb9b6113856222e30e093f7c38065e(name)\r\nfdcb9b6113856222e30e093f7c38065e --> 502369b37882b300d6620d5b4020f5b2\r\nbdcf4b078985f4a390e4ed4beacffa65(repo)\r\nbdcf4b078985f4a390e4ed4beacffa65 --> 502369b37882b300d6620d5b4020f5b2\r\n5a5493ab86ab4053f1d44302e7bdddd6(result)\r\n502369b37882b300d6620d5b4020f5b2 --> 5a5493ab86ab4053f1d44302e7bdddd6\r\nend\r\nsubgraph b1d510183f6a4c3fde207a4656c72cb4[determin_base_branch]\r\nstyle b1d510183f6a4c3fde207a4656c72cb4 fill:#fff4de,stroke:#cece71\r\n476aecd4d4d712cda1879feba46ea109[determin_base_branch]\r\nff47cf65b58262acec28507f4427de45(default_branch)\r\nff47cf65b58262acec28507f4427de45 --> 476aecd4d4d712cda1879feba46ea109\r\n150204cd2d5a921deb53c312418379a1(result)\r\n476aecd4d4d712cda1879feba46ea109 --> 150204cd2d5a921deb53c312418379a1\r\nend\r\nsubgraph 2a08ff341f159c170b7fe017eaad2f18[git_repo_to_alice_git_repo]\r\nstyle 2a08ff341f159c170b7fe017eaad2f18 fill:#fff4de,stroke:#cece71\r\n7f74112f6d30c6289caa0a000e87edab[git_repo_to_alice_git_repo]\r\ne58180baf478fe910359358a3fa02234(repo)\r\ne58180baf478fe910359358a3fa02234 --> 7f74112f6d30c6289caa0a000e87edab\r\n9b92d5a346885079a2821c4d27cb5174(result)\r\n7f74112f6d30c6289caa0a000e87edab --> 9b92d5a346885079a2821c4d27cb5174\r\nend\r\nsubgraph b5d35aa8a8dcd28d22d47caad02676b0[guess_repo_string_is_url]\r\nstyle b5d35aa8a8dcd28d22d47caad02676b0 
fill:#fff4de,stroke:#cece71\r\n0de074e71a32e30889b8bb400cf8db9f[guess_repo_string_is_url]\r\nc3bfe79b396a98ce2d9bfe772c9c20af(repo_string)\r\nc3bfe79b396a98ce2d9bfe772c9c20af --> 0de074e71a32e30889b8bb400cf8db9f\r\n2a1c620b0d510c3d8ed35deda41851c5(result)\r\n0de074e71a32e30889b8bb400cf8db9f --> 2a1c620b0d510c3d8ed35deda41851c5\r\nend\r\nsubgraph 60791520c6d124c0bf15e599132b0caf[guessed_repo_string_is_operations_git_url]\r\nstyle 60791520c6d124c0bf15e599132b0caf fill:#fff4de,stroke:#cece71\r\n102f173505d7b546236cdeff191369d4[guessed_repo_string_is_operations_git_url]\r\n4934c6211334318c63a5e91530171c9b(repo_url)\r\n4934c6211334318c63a5e91530171c9b --> 102f173505d7b546236cdeff191369d4\r\n8d0adc31da1a0919724baf73d047743c(result)\r\n102f173505d7b546236cdeff191369d4 --> 8d0adc31da1a0919724baf73d047743c\r\nend\r\nsubgraph f2c7b93622447999daab403713239ada[guessed_repo_string_means_no_git_branch_given]\r\nstyle f2c7b93622447999daab403713239ada fill:#fff4de,stroke:#cece71\r\nc8294a87e7aae8f7f9cb7f53e054fed5[guessed_repo_string_means_no_git_branch_given]\r\n5567dd8a6d7ae4fe86252db32e189a4d(repo_url)\r\n5567dd8a6d7ae4fe86252db32e189a4d --> c8294a87e7aae8f7f9cb7f53e054fed5\r\nd888e6b64b5e3496056088f14dab9894(result)\r\nc8294a87e7aae8f7f9cb7f53e054fed5 --> d888e6b64b5e3496056088f14dab9894\r\nend\r\nsubgraph 113addf4beee5305fdc79d2363608f9d[github_owns_remote]\r\nstyle 113addf4beee5305fdc79d2363608f9d fill:#fff4de,stroke:#cece71\r\n049b72b81b976fbb43607bfeeb0464c5[github_owns_remote]\r\n6c2b36393ffff6be0b4ad333df2d9419(remote)\r\n6c2b36393ffff6be0b4ad333df2d9419 --> 049b72b81b976fbb43607bfeeb0464c5\r\n19a9ee483c1743e6ecf0a2dc3b6f8c7a(repo)\r\n19a9ee483c1743e6ecf0a2dc3b6f8c7a --> 049b72b81b976fbb43607bfeeb0464c5\r\nb4cff8d194413f436d94f9d84ece0262(result)\r\n049b72b81b976fbb43607bfeeb0464c5 --> b4cff8d194413f436d94f9d84ece0262\r\nend\r\nsubgraph 8506cba6514466fb6d65f33ace4b0eac[alice_contribute_readme]\r\nstyle 8506cba6514466fb6d65f33ace4b0eac fill:#fff4de,stroke:#cece71\r\nd4507d3d1c3fbf3e7e373eae24797667[alice_contribute_readme]\r\n68cf7d6869d027ca46a5fb4dbf7001d1(repo)\r\n68cf7d6869d027ca46a5fb4dbf7001d1 --> d4507d3d1c3fbf3e7e373eae24797667\r\n2f9316539862f119f7c525bf9061e974(result)\r\nd4507d3d1c3fbf3e7e373eae24797667 --> 2f9316539862f119f7c525bf9061e974\r\nend\r\nsubgraph 4233e6dc67cba131d4ef005af9c02959[contribute_readme_md]\r\nstyle 4233e6dc67cba131d4ef005af9c02959 fill:#fff4de,stroke:#cece71\r\n3db0ee5d6ab83886bded5afd86f3f88f[contribute_readme_md]\r\n37044e4d8610abe13849bc71a5cb7591(base)\r\n37044e4d8610abe13849bc71a5cb7591 --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n631c051fe6050ae8f8fc3321ed00802d(commit_message)\r\n631c051fe6050ae8f8fc3321ed00802d --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n182194bab776fc9bc406ed573d621b68(repo)\r\n182194bab776fc9bc406ed573d621b68 --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n0ee9f524d2db12be854fe611fa8126dd(result)\r\n3db0ee5d6ab83886bded5afd86f3f88f --> 0ee9f524d2db12be854fe611fa8126dd\r\nend\r\nsubgraph a6080d9c45eb5f806a47152a18bf7830[create_readme_file_if_not_exists]\r\nstyle a6080d9c45eb5f806a47152a18bf7830 fill:#fff4de,stroke:#cece71\r\n67e388f508dd96084c37d236a2c67e67[create_readme_file_if_not_exists]\r\n54faf20bfdca0e63d07efb3e5a984cf1(readme_contents)\r\n54faf20bfdca0e63d07efb3e5a984cf1 --> 67e388f508dd96084c37d236a2c67e67\r\n8c089c362960ccf181742334a3dccaea(repo)\r\n8c089c362960ccf181742334a3dccaea --> 67e388f508dd96084c37d236a2c67e67\r\n5cc65e17d40e6a7223c1504f1c4b0d2a(result)\r\n67e388f508dd96084c37d236a2c67e67 --> 
5cc65e17d40e6a7223c1504f1c4b0d2a\r\nend\r\nsubgraph e7757158127e9845b2915c16a7fa80c5[readme_commit_message]\r\nstyle e7757158127e9845b2915c16a7fa80c5 fill:#fff4de,stroke:#cece71\r\n562bdc535c7cebfc66dba920b1a17540[readme_commit_message]\r\n0af5cbea9050874a0a3cba73bb61f892(issue_url)\r\n0af5cbea9050874a0a3cba73bb61f892 --> 562bdc535c7cebfc66dba920b1a17540\r\n2641f3b67327fb7518ee34a3a40b0755(result)\r\n562bdc535c7cebfc66dba920b1a17540 --> 2641f3b67327fb7518ee34a3a40b0755\r\nend\r\nsubgraph cf99ff6fad80e9c21266b43fd67b2f7b[readme_issue]\r\nstyle cf99ff6fad80e9c21266b43fd67b2f7b fill:#fff4de,stroke:#cece71\r\nda44417f891a945085590baafffc2bdb[readme_issue]\r\nd519830ab4e07ec391038e8581889ac3(body)\r\nd519830ab4e07ec391038e8581889ac3 --> da44417f891a945085590baafffc2bdb\r\n268852aa3fa8ab0864a32abae5a333f7(repo)\r\n268852aa3fa8ab0864a32abae5a333f7 --> da44417f891a945085590baafffc2bdb\r\n77a11dd29af309cf43ed321446c4bf01(title)\r\n77a11dd29af309cf43ed321446c4bf01 --> da44417f891a945085590baafffc2bdb\r\n1d2360c9da18fac0b6ec142df8f3fbda(result)\r\nda44417f891a945085590baafffc2bdb --> 1d2360c9da18fac0b6ec142df8f3fbda\r\nend\r\nsubgraph 7ec0442cf2d95c367912e8abee09b217[readme_pr]\r\nstyle 7ec0442cf2d95c367912e8abee09b217 fill:#fff4de,stroke:#cece71\r\nbb314dc452cde5b6af5ea94dd277ba40[readme_pr]\r\n127d77c3047facc1daa621148c5a0a1d(base)\r\n127d77c3047facc1daa621148c5a0a1d --> bb314dc452cde5b6af5ea94dd277ba40\r\ncb421e4de153cbb912f7fbe57e4ad734(body)\r\ncb421e4de153cbb912f7fbe57e4ad734 --> bb314dc452cde5b6af5ea94dd277ba40\r\ncbf7a0b88c0a41953b245303f3e9a0d3(head)\r\ncbf7a0b88c0a41953b245303f3e9a0d3 --> bb314dc452cde5b6af5ea94dd277ba40\r\ne5f9ad44448abd2469b3fd9831f3d159(origin)\r\ne5f9ad44448abd2469b3fd9831f3d159 --> bb314dc452cde5b6af5ea94dd277ba40\r\na35aee6711d240378eb57a3932537ca1(repo)\r\na35aee6711d240378eb57a3932537ca1 --> bb314dc452cde5b6af5ea94dd277ba40\r\ndfcce88a7d605d46bf17de1159fbe5ad(title)\r\ndfcce88a7d605d46bf17de1159fbe5ad --> bb314dc452cde5b6af5ea94dd277ba40\r\na210a7890a7bea8d629368e02da3d806(result)\r\nbb314dc452cde5b6af5ea94dd277ba40 --> a210a7890a7bea8d629368e02da3d806\r\nend\r\nsubgraph 227eabb1f1c5cc0bc931714a03049e27[readme_pr_body]\r\nstyle 227eabb1f1c5cc0bc931714a03049e27 fill:#fff4de,stroke:#cece71\r\n2aea976396cfe68dacd9bc7d4a3f0cba[readme_pr_body]\r\nc5dfd309617c909b852afe0b4ae4a178(readme_issue)\r\nc5dfd309617c909b852afe0b4ae4a178 --> 2aea976396cfe68dacd9bc7d4a3f0cba\r\n40ddb5b508cb5643e7c91f7abdb72b84(result)\r\n2aea976396cfe68dacd9bc7d4a3f0cba --> 40ddb5b508cb5643e7c91f7abdb72b84\r\nend\r\nsubgraph 48687c84e69b3db0acca625cbe2e6b49[readme_pr_title]\r\nstyle 48687c84e69b3db0acca625cbe2e6b49 fill:#fff4de,stroke:#cece71\r\nd8668ff93f41bc241c8c540199cd7453[readme_pr_title]\r\n3b2137dd1c61d0dac7d4e40fd6746cfb(readme_issue)\r\n3b2137dd1c61d0dac7d4e40fd6746cfb --> d8668ff93f41bc241c8c540199cd7453\r\n956e024fde513b3a449eac9ee42d6ab3(result)\r\nd8668ff93f41bc241c8c540199cd7453 --> 956e024fde513b3a449eac9ee42d6ab3\r\nend\r\nsubgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL]\r\nstyle d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71\r\nf577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL]\r\n7440e73a8e8f864097f42162b74f2762(URL)\r\n7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40\r\n8e39b501b41c5d0e4596318f80a03210(valid)\r\nf577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210\r\nend\r\nsubgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo]\r\nstyle af8da22d1318d911f29b95e687f87c5d 
fill:#fff4de,stroke:#cece71\r\n155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo]\r\need77b9eea541e0c378c67395351099c(URL)\r\need77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n8b5928cd265dd2c44d67d076f60c8b05(ssh_key)\r\n8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n4e1d5ea96e050e46ebf95ebc0713d54c(repo)\r\n155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c\r\n6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL}\r\n6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\nend\r\nsubgraph d3d91578caf34c0ae944b17853783406[git_repo_default_branch]\r\nstyle d3d91578caf34c0ae944b17853783406 fill:#fff4de,stroke:#cece71\r\n546062a96122df465d2631f31df4e9e3[git_repo_default_branch]\r\n181f1b33df4d795fbad2911ec7087e86(repo)\r\n181f1b33df4d795fbad2911ec7087e86 --> 546062a96122df465d2631f31df4e9e3\r\n57651c1bcd24b794dfc8d1794ab556d5(branch)\r\n546062a96122df465d2631f31df4e9e3 --> 57651c1bcd24b794dfc8d1794ab556d5\r\n5ed1ab77e726d7efdcc41e9e2f8039c6(remote)\r\n546062a96122df465d2631f31df4e9e3 --> 5ed1ab77e726d7efdcc41e9e2f8039c6\r\n4c3cdd5f15b7a846d291aac089e8a622{no_git_branch_given}\r\n4c3cdd5f15b7a846d291aac089e8a622 --> 546062a96122df465d2631f31df4e9e3\r\nend\r\nend\r\nsubgraph a4827add25f5c7d5895c5728b74e2beb[Cleanup Stage]\r\nstyle a4827add25f5c7d5895c5728b74e2beb fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph 58ca4d24d2767176f196436c2890b926[Output Stage]\r\nstyle 58ca4d24d2767176f196436c2890b926 fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph inputs[Inputs]\r\nstyle inputs fill:#f6dbf9,stroke:#a178ca\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> e07552ee3b6b7696cb3ddd786222eaad\r\nba29b52e9c5aa88ea1caeeff29bfd491 --> cee6b5fdd0b6fbd0539cdcdc7f5a3324\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> 330f463830aa97e88917d5a9d1c21500\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> dc7c5f0836f7d2564c402bf956722672\r\nba29b52e9c5aa88ea1caeeff29bfd491 --> 58d8518cb0d6ef6ad35dc242486f1beb\r\n79e1ea6822bff603a835fb8ee80c7ff3 --> e824ae072860bc545fc7d55aa0bca479\r\n135ee61e3402d6fcbd7a219b0b4ccd73 --> e824ae072860bc545fc7d55aa0bca479\r\n40109d487bb9f08608d8c5f6e747042f --> 33d806f9b732bfd6b96ae2e9e4243a68\r\ndd5aab190ce844673819298c5b8fde76 --> 3786b4af914402320d260d077844620e\r\n9b92d5a346885079a2821c4d27cb5174 --> 3786b4af914402320d260d077844620e\r\n150204cd2d5a921deb53c312418379a1 --> 4f752ce18209f62ed749e88dd1f70266\r\nce14ca2191f2b1c13c605b240e797255 --> 2def8c6923c832adf33989b26c91295a\r\nda4270ecc44b6d9eed9809a560d24a28 --> f5548fcbcec8745ddf04104fc78e83a3\r\nc9f2ea5a7f25b3ae9fbf5041be5fa071 --> c0a0fa68a872adf890ed639e07ed5882\r\nc94383981c3a071b8c3df7293c8c7c92(seed<br>ContributingIssueBody)\r\nc94383981c3a071b8c3df7293c8c7c92 --> 821d21e8a69d1fa1757147e7e768f306\r\nda4270ecc44b6d9eed9809a560d24a28 --> 0581b90c76b0a4635a968682b060abff\r\n90c6a88275f27b28dc12f5741ac1652f(seed<br>ContributingIssueTitle)\r\n90c6a88275f27b28dc12f5741ac1652f --> 809719538467f6d0bf18f7ae26f08d80\r\n150204cd2d5a921deb53c312418379a1 --> bb85c3467b05192c99a3954968c7a612\r\n99a7dd1ae037153eef80e1dee51b9d2b --> 77f6c1c6b7ee62881b49c289097dfbde\r\n24292ae12efd27a227a0d6368ba01faa --> a0a2fabc65fe5601c7ea289124d04f70\r\nb4cff8d194413f436d94f9d84ece0262 --> cf92708915b9f41cb490b991abd6c374\r\nda4270ecc44b6d9eed9809a560d24a28 --> 
210ae36c85f3597c248e0b32da7661ae\r\n77a2f9d4dfad5f520f1502e8ba70e47a --> 1700dc637c25bd503077a2a1422142e2\r\nc9f2ea5a7f25b3ae9fbf5041be5fa071 --> 6118470d0158ef1a220fe7c7232e1b63\r\nc9f2ea5a7f25b3ae9fbf5041be5fa071 --> 0fd26f9166ccca10c68e9aefa9c15767\r\n90b3c16d6d8884aa6f70b475d98f661b(seed<br>repo.directory.contributing.contents)\r\n90b3c16d6d8884aa6f70b475d98f661b --> 401c179bb30b24c2ca989c64d0b1cdc7\r\nda4270ecc44b6d9eed9809a560d24a28 --> dde78f81b1bdfe02c0a2bf6e51f65cb4\r\n21ccfd2c550bd853d28581f0b0c9f9fe(seed<br>default.branch.name)\r\n21ccfd2c550bd853d28581f0b0c9f9fe --> fdcb9b6113856222e30e093f7c38065e\r\ndd5aab190ce844673819298c5b8fde76 --> bdcf4b078985f4a390e4ed4beacffa65\r\n9b92d5a346885079a2821c4d27cb5174 --> bdcf4b078985f4a390e4ed4beacffa65\r\n5a5493ab86ab4053f1d44302e7bdddd6 --> ff47cf65b58262acec28507f4427de45\r\n57651c1bcd24b794dfc8d1794ab556d5 --> ff47cf65b58262acec28507f4427de45\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> e58180baf478fe910359358a3fa02234\r\n40109d487bb9f08608d8c5f6e747042f --> c3bfe79b396a98ce2d9bfe772c9c20af\r\n2a1c620b0d510c3d8ed35deda41851c5 --> 4934c6211334318c63a5e91530171c9b\r\n2a1c620b0d510c3d8ed35deda41851c5 --> 5567dd8a6d7ae4fe86252db32e189a4d\r\n5ed1ab77e726d7efdcc41e9e2f8039c6 --> 6c2b36393ffff6be0b4ad333df2d9419\r\ndd5aab190ce844673819298c5b8fde76 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a\r\n9b92d5a346885079a2821c4d27cb5174 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a\r\ndd5aab190ce844673819298c5b8fde76 --> 68cf7d6869d027ca46a5fb4dbf7001d1\r\n9b92d5a346885079a2821c4d27cb5174 --> 68cf7d6869d027ca46a5fb4dbf7001d1\r\n150204cd2d5a921deb53c312418379a1 --> 37044e4d8610abe13849bc71a5cb7591\r\n2641f3b67327fb7518ee34a3a40b0755 --> 631c051fe6050ae8f8fc3321ed00802d\r\n2f9316539862f119f7c525bf9061e974 --> 182194bab776fc9bc406ed573d621b68\r\nd2708225c1f4c95d613a2645a17a5bc0(seed<br>repo.directory.readme.contents)\r\nd2708225c1f4c95d613a2645a17a5bc0 --> 54faf20bfdca0e63d07efb3e5a984cf1\r\n2f9316539862f119f7c525bf9061e974 --> 8c089c362960ccf181742334a3dccaea\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 0af5cbea9050874a0a3cba73bb61f892\r\n1daacccd02f8117e67ad3cb8686a732c(seed<br>ReadmeIssueBody)\r\n1daacccd02f8117e67ad3cb8686a732c --> d519830ab4e07ec391038e8581889ac3\r\n2f9316539862f119f7c525bf9061e974 --> 268852aa3fa8ab0864a32abae5a333f7\r\n0c1ab2d4bda10e1083557833ae5c5da4(seed<br>ReadmeIssueTitle)\r\n0c1ab2d4bda10e1083557833ae5c5da4 --> 77a11dd29af309cf43ed321446c4bf01\r\n150204cd2d5a921deb53c312418379a1 --> 127d77c3047facc1daa621148c5a0a1d\r\n40ddb5b508cb5643e7c91f7abdb72b84 --> cb421e4de153cbb912f7fbe57e4ad734\r\n0ee9f524d2db12be854fe611fa8126dd --> cbf7a0b88c0a41953b245303f3e9a0d3\r\nb4cff8d194413f436d94f9d84ece0262 --> e5f9ad44448abd2469b3fd9831f3d159\r\n2f9316539862f119f7c525bf9061e974 --> a35aee6711d240378eb57a3932537ca1\r\n956e024fde513b3a449eac9ee42d6ab3 --> dfcce88a7d605d46bf17de1159fbe5ad\r\n1d2360c9da18fac0b6ec142df8f3fbda --> c5dfd309617c909b852afe0b4ae4a178\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 3b2137dd1c61d0dac7d4e40fd6746cfb\r\n8d0adc31da1a0919724baf73d047743c --> 7440e73a8e8f864097f42162b74f2762\r\n8d0adc31da1a0919724baf73d047743c --> eed77b9eea541e0c378c67395351099c\r\na6ed501edbf561fda49a0a0a3ca310f0(seed<br>git_repo_ssh_key)\r\na6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05\r\n8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 181f1b33df4d795fbad2911ec7087e86\r\nend\r\n```\r\n\r\n- Notes\r\n - `create_*_if_not_exists` doesn't appear connected.\r\n- Only either README or 
CONTRIBUTING is currently being added when\r\n we run with our new CONTRIBUTING contribution flow overlayed.\r\n\r\n```console\r\n$ for pr in $(gh -R https://github.com/pdxjohnny/testaaaa pr list --json number --jq '.[].number'); do gh -R https://github.com/pdxjohnny/testaaaa pr close \"${pr}\"; done\r\n\u2713 Closed pull request #222 (Recommended Community Standard: README)\r\n\u2713 Closed pull request #219 (Recommended Community Standard: CONTRIBUTING)\r\n$ nodemon -e py --exec 'clear; for pr in $(gh -R https://github.com/pdxjohnny/testaaaa pr list --json number --jq '.[].number'); do gh -R https://github.com/pdxjohnny/testaaaa pr close \"${pr}\"; done; (alice please contribute -log debug -repos https://github.com/pdxjohnny/testaaaa -- recommended community standards; gh -R https://github.com/pdxjohnny/testaaaa pr list) 2>&1 | tee .output/$(date +%4Y-%m-%d-%H-%M).txt; test 1'\r\n$ less -S .output/$(ls .output/ | tail -n 1)\r\n```\r\n\r\n### Refactor into README and CONTRIBUTING Overlays\r\n\r\n- Had the thought, aren't we just adding a new context here?\r\n\r\n```diff\r\ndiff --git a/dffml/df/memory.py b/dffml/df/memory.py\r\nindex 59286d4927..87c75d637b 100644\r\n--- a/dffml/df/memory.py\r\n+++ b/dffml/df/memory.py\r\n@@ -377,6 +377,7 @@ class MemoryInputNetworkContext(BaseInputNetworkContext):\r\n self.ctxhd[handle_string].by_origin[item.origin] = []\r\n # Add input to by origin set\r\n self.ctxhd[handle_string].by_origin[item.origin].append(item)\r\n+ self.logger.debug(\"Added to %s: %r\", handle_string, item)\r\n\r\n async def uadd(self, *args: Input):\r\n \"\"\"\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\nindex 2873a1b193..cc4d374e57 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\n@@ -1,7 +1,8 @@\r\n+import asyncio\r\n import pathlib\r\n import textwrap\r\n import itertools\r\n-from typing import NamedTuple, NewType, Optional\r\n+from typing import NamedTuple, NewType, Optional, Type, Any\r\n\r\n\r\n import dffml\r\n@@ -183,6 +184,34 @@ class OverlayGitHub:\r\n return remote\r\n\r\n\r\n+async def context_adder(\r\n+ self,\r\n+ upstream_cls: Type[Any],\r\n+ input_set_context: dffml.BaseInputSetContext,\r\n+ value: Any,\r\n+):\r\n+ upstream = dffml.DataFlow(*dffml.object_to_operations(upstream_cls))\r\n+ key, definition = list(self.parent.op.outputs.items())[0]\r\n+ async with self.octx.ictx.definitions(self.ctx) as definitions:\r\n+ await self.octx.ictx.cadd(\r\n+ input_set_context,\r\n+ dffml.Input(\r\n+ value=value,\r\n+ definition=definition,\r\n+ parents=None,\r\n+ origin=(self.parent.op.instance_name, key),\r\n+ ),\r\n+ *[\r\n+ item\r\n+ async for item in definitions.inputs()\r\n+ if (\r\n+ item.definition in upstream.definitions.values()\r\n+ and item.definition not in self.parent.op.inputs.values()\r\n+ )\r\n+ ],\r\n+ )\r\n+\r\n+\r\n # NOTE Not sure if the orchestrator will know what to do if we do this\r\n # ReadmeGitRepo = AliceGitRepo\r\n class ReadmeGitRepo(NamedTuple):\r\n@@ -204,6 +233,9 @@ class OverlayREADME:\r\n\r\n # async def cli_run_on_repo(self, repo: \"CLIRunOnRepo\"):\r\n async def alice_contribute_readme(self, repo: AliceGitRepo) -> ReadmeGitRepo:\r\n+ # await context_adder(\r\n+ # 
self, OverlayREADME, AliceGitRepoInputSetContext(repo), repo\r\n+ # )\r\n # TODO Clean this up once SystemContext refactor complete\r\n readme_dataflow_cls_upstream = OverlayREADME\r\n readme_dataflow_cls_overlays = dffml.Overlay.load(\r\n```\r\n\r\n```\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Stage: PROCESSING: alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Inputs: {'default_branch': 'master'}\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Conditions: {}\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Outputs: {'result': 'master'}\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:---\r\nDEBUG:dffml.MemoryInputNetworkContext:Added to https://github.com/pdxjohnny/testaaaa: Input(value=master, definition=repo.git.base.branch)\r\nDEBUG:dffml.MemoryLockNetworkContext:Acquiring: 6fc55525-c499-421c-8b07-497dd277b1ff(GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa')) (now held by Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing', inputs={'repo': AliceGitRepo}, outputs={'result': ContributingGitRepo}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing', validator=False, retry=0))\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:---\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing Stage: PROCESSING: alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing Inputs: {'repo': GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa')}\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing Conditions: {}\r\nDEBUG:dffml.MemoryInputNetworkContext:Added to GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'): Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'), definition=ContributingGitRepo)\r\nDEBUG:dffml.MemoryInputNetworkContext:Added to GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'): Input(value=origin, 
definition=writable.github.remote.origin)\r\nDEBUG:dffml.MemoryInputNetworkContext:Added to GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'): Input(value=master, definition=repo.git.base.branch)\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayCONTRIBUTING:alice_contribute_contributing Outputs: None\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:---\r\nDEBUG:dffml.MemoryLockNetworkContext:Acquiring: 6fc55525-c499-421c-8b07-497dd277b1ff(GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa')) (now held by Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists', inputs={'repo': AliceGitRepo, 'name': default.branch.name}, outputs={'result': git_branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists', validator=False, retry=0))\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:---\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists Stage: PROCESSING: alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists Inputs: {'repo': GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'), 'name': 'main'}\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists Conditions: {}\r\nDEBUG:dffml_feature_git.util:proc.create: ('git', 'branch', '-r')\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:create_branch_if_none_exists Outputs: None\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:---\r\nDEBUG:dffml.MemoryLockNetworkContext:Acquiring: 6fc55525-c499-421c-8b07-497dd277b1ff(GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa')) (now held by Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', inputs={'repo': AliceGitRepo}, outputs={'result': ReadmeGitRepo}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', validator=False, retry=0))\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:---\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Stage: PROCESSING: 
alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Inputs: {'repo': GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa')}\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Conditions: {}\r\nDEBUG:dffml.MemoryInputNetworkContext:Added to GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'): Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\r\nDEBUG:dffml.MemoryInputNetworkContext:Added to GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'): Input(value=origin, definition=writable.github.remote.origin)\r\nDEBUG:dffml.MemoryInputNetworkContext:Added to GitRepoSpec(directory='/tmp/dffml-feature-git-rrflb9gm', URL='https://github.com/pdxjohnny/testaaaa'): Input(value=master, definition=repo.git.base.branch)\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:---\r\nDEBUG:dffml.MemoryInputNetworkContext:Received https://github.com/pdxjohnny/testaaaa result {} from <dffml.df.memory.MemoryOrchestratorContext object at 0x7f2405795d90>\r\nDEBUG:dffml.MemoryInputNetworkContext:Received https://github.com/pdxjohnny/testaaaa result {} from <dffml.df.memory.MemoryOrchestratorContext object at 0x7f2405795d90>\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.cli.OverlayCLI:cli_run_on_repo Outputs: None\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:---\r\nDEBUG:dffml.MemoryOrchestratorContext:ctx.outstanding: 1\r\nDEBUG:dffml.MemoryInputNetworkContext:Received 9eda82af632e2587d31fcd06d5fb0bfb1df47c4a8383e6a998f26c7c4906a86b result {} from <dffml.df.memory.MemoryOrchestratorContext object at 0x7f240588c040>\r\nDEBUG:dffml.MemoryOrchestratorContext:ctx.outstanding: 0\r\nhttps://github.com/pdxjohnny/testaaaa {}\r\n9eda82af632e2587d31fcd06d5fb0bfb1df47c4a8383e6a998f26c7c4906a86b {}\r\n```\r\n\r\n- Want to understand why we are not waiting for the contexts to complete which were added\r\n in above diff and logs.\r\n - Fallback plan is to call both from a function in a separate overlay until it's working\r\n this will just call `run_custom` via a helper function for both README and CONTRIBUTING\r\n overlays.\r\n - Going to write this first, then contributing new file tutorial\r\n - Then tutorial on `alice shouldi contribute` with overlay addition via installed to entrypoint\r\n - Then test with ability to add overlays via CLI as one offs\r\n - Final bit of each tutorial is to add to this fallback overlay\r\n - If we still have time before 8 AM then we'll try to debug\r\n- alice: please: contribute: recommended community standards: readme: Scope PR title and body definitions\r\n - 1cf1d73bcdb8f0940c02e01dec1e26253c2ec4cf\r\n- Tried with `dffml.run()`, it worked right away. 
Going with this.\r\n - 1bf5e4c9a4eae34f30f9c4b5c9a04d09d6a11c6e\r\n - alice: please: contribute: recommended community standards: readme: Use dffml.subflow_typecast to execute README contribution\r\n - 85d57ad8989bfb12d5fe0fb6eec21002ce75f271\r\n - high level: subflow typecast: Basic OpImpCtx helper\r\n - 8c0531e5364c09fec019d1971e4033401bfcbd2b\r\n - overlay: static overlay application with loading entrypoint dataflow class with overlays applied.\r\n - af4306a500daf11ba3c4c3db39c1da9879456d12\r\n - alice: please: contribute: recommended community standards: Disable OverlayMetaIssue in default installed set of overlays\r\n\r\n\r\n### How to help Alice contribute more files\r\n\r\nThis tutorial will help you create a new Open Architecture / Alice\r\noverlay which runs whenever another flow runs. The upstream flow\r\nin this case is the `AlicePleaseContributeRecommendedCommunityStandards`\r\nbase flow.\r\n\r\n- Copy the README overlay to a new file\r\n\r\n```console\r\n$ cp alice/please/contribute/recommended_community_standards/readme.py alice/please/contribute/recommended_community_standards/contribute.py\r\n```\r\n\r\n- Rename types, classes, variables, etc.\r\n\r\n```console\r\n$ sed -e 's/Readme/Contributing/g' -e 's/README/CONTRIBUTING/g' -e 's/readme/contributing/g' -i alice/please/contribute/recommended_community_standards/contribute.py\r\n```
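\r\n\r\nAfter the rename the new overlay has roughly the following shape. This is an illustrative sketch only, mirroring the `ReadmeGitRepo` / `OverlayREADME` definitions shown in the diffs above; the generated `contribute.py` carries over every renamed operation from `readme.py`, and `AliceGitRepo` comes from the base flow module.\r\n\r\n```python\r\n# Sketch (not the generated file) of contribute.py after the sed pass above.\r\nfrom typing import NamedTuple\r\n\r\n\r\nclass ContributingGitRepo(NamedTuple):\r\n    directory: str\r\n    URL: str\r\n\r\n\r\nclass OverlayCONTRIBUTING:\r\n    async def alice_contribute_contributing(\r\n        self, repo: \"AliceGitRepo\"\r\n    ) -> ContributingGitRepo:\r\n        # Dispatches the CONTRIBUTING subflow for this repo, just as\r\n        # alice_contribute_readme does in the README overlay.\r\n        ...\r\n```\r\n\r\nNext, register the new overlay in `entry_points.txt` so it gets applied to the upstream flow: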
\r\n\r\n```diff\r\ndiff --git a/entities/alice/entry_points.txt b/entities/alice/entry_points.txt\r\nindex 129b2866a1..9e130cb3b2 100644\r\n--- a/entities/alice/entry_points.txt\r\n+++ b/entities/alice/entry_points.txt\r\n@@ -9,6 +9,7 @@ CLI = alice.please.contribute.recomme\r\n OverlayGit = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayGit\r\n OverlayGitHub = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayGitHub\r\n OverlayREADME = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayREADME\r\n+OverlayCONTRIBUTING = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayCONTRIBUTING\r\n # OverlayMetaIssue = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayMetaIssue\r\n\r\n [dffml.overlays.alice.please.contribute.recommended_community_standards.overlay.readme]\r\n```\r\n\r\n**dffml.git/entities/alice/entry_points.txt**\r\n\r\n```ini\r\n[dffml.overlays.alice.please.contribute.recommended_community_standards.overlay.contributing]\r\nOverlayGit = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayGit\r\nOverlayGitHub = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayGitHub\r\n```\r\n\r\n- Reinstall for the new entrypoints to take effect\r\n\r\n```console\r\n$ python -m pip install -e .\r\n```\r\n\r\n- Re-run the command and observe the results\r\n\r\n```console\r\n$ for pr in $(gh -R https://github.com/pdxjohnny/testaaaa pr list --json number --jq '.[].number'); do gh -R https://github.com/pdxjohnny/testaaaa pr close \"${pr}\"; done; (alice please contribute -log debug -repos https://github.com/pdxjohnny/testaaaa -- recommended community standards; gh -R https://github.com/pdxjohnny/testaaaa pr list)\r\n```\r\n\r\n![Screenshot showing pull request for adding README.md and CONTRIBUTING.md and CODE_OF_CONDUCT.md files](https://user-images.githubusercontent.com/5950433/181826046-53ae3ef5-6750-48ad-afd2-8cf9174e0b63.png)\r\n\r\n### Script to test Coach Alice Our Open Source Guide tutorial\r\n\r\n```bash\r\n#!/usr/bin/env bash\r\nset -x\r\nset -e\r\n\r\n# export USER=githubusername\r\nexport REPO_URL=\"https://github.com/$USER/my-new-python-project\"\r\n\r\ncd $(mktemp -d)\r\n\r\ngit clone --depth=1 -b alice https://github.com/intel/dffml dffml\r\ncd dffml/entities/alice\r\npython -m venv .venv\r\n. .venv/bin/activate\r\npython -m pip install -U pip setuptools wheel\r\npython -m pip install \\\r\n -e .[dev] \\\r\n -e ../../ \\\r\n -e ../../examples/shouldi/ \\\r\n -e ../../feature/git/ \\\r\n -e ../../operations/innersource/ \\\r\n -e ../../configloader/yaml/\r\n\r\ngh repo create -y --private \"${REPO_URL}\"\r\ngit clone \"${REPO_URL}\"\r\ncd my-new-python-project\r\necho 'print(\"Hello World\")' > test.py\r\ngit add test.py\r\ngit commit -sam 'Initial Commit'\r\ngit push --set-upstream origin $(git branch --show-current)\r\ncd ..\r\nrm -rf my-new-python-project\r\n\r\ncp alice/please/contribute/recommended_community_standards/readme.py alice/please/contribute/recommended_community_standards/code_of_conduct.py\r\n\r\nsed -e 's/Readme/CodeOfConduct/g' -e 's/README/CODE_OF_CONDUCT/g' -e 's/readme/code_of_conduct/g' -i alice/please/contribute/recommended_community_standards/code_of_conduct.py\r\n\r\nsed -i 's/OverlayREADME .*/&\\nOverlayCODE_OF_CONDUCT = alice.please.contribute.recommended_community_standards.code_of_conduct:OverlayCODE_OF_CONDUCT/' entry_points.txt\r\n\r\ntee -a entry_points.txt << 'EOF'\r\n\r\n[dffml.overlays.alice.please.contribute.recommended_community_standards.code_of_conduct]\r\nOverlayGit = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayGit\r\nOverlayGitHub = alice.please.contribute.recommended_community_standards.recommended_community_standards:OverlayGitHub\r\nEOF\r\n\r\npython -m pip install -e .\r\n\r\nalice please contribute -log debug -repos \"${REPO_URL}\" -- recommended community standards\r\n\r\ngh -R \"${REPO_URL}\" pr list\r\n# 343 Recommended Community Standard: README alice-contribute-recommended-community-standards-readme OPEN\r\n# 341 Recommended Community Standard: CONTRIBUTING alice-contribute-recommended-community-standards-contributing OPEN\r\n# 339 Recommended Community Standard: CODE_OF_CONDUCT alice-contribute-recommended-community-standards-code_of_conduct OPEN\r\n\r\nfor pr in $(gh -R \"${REPO_URL}\" pr list --json number --jq '.[].number');\r\ndo\r\n gh -R \"${REPO_URL}\" pr close \"${pr}\"\r\ndone\r\n```\r\n\r\n- The Alice codebase\r\n\r\n```console\r\n$ find alice/please/ -type f | grep -v __init\r\nalice/please/contribute/recommended_community_standards/contributing.py\r\nalice/please/contribute/recommended_community_standards/cli.py\r\nalice/please/contribute/recommended_community_standards/readme.py\r\nalice/please/contribute/recommended_community_standards/meta_issue.py\r\nalice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\n```\r\n\r\n### TODOs\r\n\r\n- Explain how to add more top level Alice CLI commands\r\n- Explain how to overlay shouldi flows beyond standard DFFML docs."
}
]
},
{
"body": "# 2022-08-22 Engineering Logs\r\n\r\n- SCITT\r\n - https://notes.ietf.org/notes-ietf-114-scitt\r\n - https://youtu.be/6B8Bv0naAIA\r\n - https://mailarchive.ietf.org/arch/msg/scitt/b1bvDwutpAdLI7sa7FzXrtkY_m0/\r\n - https://mailarchive.ietf.org/arch/msg/scitt/iEAhuuicVxgoXJiAZIGmpZOctcc/#",
"replies": [
{
"body": "## 2022-08-22 @pdxjohnny Engineering Logs\r\n\r\n- SCITT\r\n - https://notes.ietf.org/notes-ietf-114-scitt\r\n - https://youtu.be/6B8Bv0naAIA\r\n - https://mailarchive.ietf.org/arch/msg/scitt/b1bvDwutpAdLI7sa7FzXrtkY_m0/\r\n - https://mailarchive.ietf.org/arch/msg/scitt/iEAhuuicVxgoXJiAZIGmpZOctcc/#\r\n- TODO\r\n - [ ] Update with some of the very spotty wording above and try to flush it out with more conceptual meat now that the tone is established / future John has an example to work with.\r\n - https://github.com/intel/dffml/commit/9aeb7f19e541e66fc945c931801215560a8206d7\r\n - [ ] Update somewhere else in Vol 1 to include from\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0015/reply_0002.md"
}
]
},
{
"body": "# 2022-08-24 Engineering Logs\r\n\r\n- SCITT\r\n - https://mailarchive.ietf.org/arch/msg/scitt/R56CX1LqSgDBRCzZIk3pZnJEV_c/\r\n - \u201c\r\nIn summary, a NIST Vulnerability Disclosure Report (VDR) is an attestation\r\nby a software vendor showing that the vendor has checked each component of a\r\nsoftware product SBOM for vulnerabilities and reports on the details of any\r\nvulnerabilities reported by a NIST NVD search. The VDR is a living document\r\nwhich the software vendor updates as needed when new vulnerabilities have\r\nbeen discovered and reported. A VDR is published whenever a software vendor\r\nissues a new or updated SBOM, including initial product release, making it\r\navailable online, all the time, to all customers of the product described in\r\nthe VDR. This gives software consumers that ability to answer the question\r\n\"What is the vulnerability status of my software product from Vendor V, as\r\nof NOW?\".\u201d\r\n - From VEX to VDR? Lets dive in more next week",
"replies": [
{
"body": "## 2022-08-24 @sedihglow Engineering Logs\r\n\r\n- Alice\r\n - Ran through contributing setup on local PC\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst#cloning-the-repo\r\n- [ ] `alice please build if needed and run /path/to/repo`\r\n - Try two different repos, mainly focused on C\r\n - https://github.com/sedihglow/rpi4\r\n - https://github.com/sedihglow/red_black_tree\r\n\r\n```console\r\n$ sudo update-alternatives: using /usr/bin/python3.9 to provide /usr/local/bin/python (python) in auto mode\r\n$ sudo apt-get update && sudo apt-get install -y tmux python3.9 python3-pip python3.9-venv python3.9-dev build-essential\r\n```"
},
{
"body": "https://datatracker.ietf.org/doc/draft-ietf-rats-architecture/"
}
]
},
{
"body": "- Policy\r\n - ABC\u2019s of Conformity Assessment\r\n - https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.2000-01.pdf\r\n - This might be helpful later when we write docs for / think about how to apply policy (see vol 0 introduction arch diagram)\r\n- SCITT\r\n - Zachary Newman shared looking at OpenSSF / SCITT terminology ran into same topics that we did when we brought up using shared underlying protocols and formats in the [2022-07-25 SCITT meeting](https://github.com/intel/dffml/discussions/1406#discussioncomment-3223361) when talking about RATs style attestation vs SLSA/in-toto/sigstore style.\r\n - https://mailarchive.ietf.org/arch/msg/scitt/utSOqlCifoorbqUGWNf-wMlBYR4/\r\n - Dick agrees with Zach's analysis. \"I've also been monitoring the OpenSSF Scorecard initiative, which goes beyond sigstore attestation checking to assign a \"trust score\". Not sure if this has traction, but there is a lot of activity on github. https://github.com/ossf/scorecard/blob/main/README.md#basic-usage OpenSSF does NOT appear to be following/implementing NIST C-SCRM recommendations and standards for Executive Order 14028 and consumer software labeling and other attestation recommendations; https://www.nist.gov/document/software-supply-chain-security-guidance-under-executive-order-eo-14028-section-4e\" [Dick Brooks]\r\n - Commit message to charter\r\n - > The Endor POC by `@OR13` was exemplary because there was a low amount of abstraction / extra information / steps introduced for the learner to understand the sequence of data transformations involved. It makes clear the contents of the serialization of choice (DIDs + VCs in Endor's case) and how that varies across the steps. The POC provided immediate value on the mailing list in a way that examples which introduce more abstraction layers are unable to do as quickly.\r\n >\r\n > We apply our recent learning from this success by adding to the charter the production of a similar example which in this patch we call \"file-based\", but we could change that to a more descriptive term if there is one. Having an example similar to the learning methodology presented via Endor would accelerate the pace at which developers up and down the software stack and in different programming languages would be able to adopt SCITT. This is due to the low level of abstraction introduced by it's file and shell based implementation. Files and shell commands translate easy into other languages where they can be slowly swapped out from initial fork/exec and equivalents to language code.\r\n >\r\n > The SCITT community could potentially provide documentation on how the fork/exec style implementation could be transformed into the HTTP server implementation. Due to the generic nature of SCITT and the many touchpoints various software systems will likely have with it in the future. It is important for us to consider as a part of our threat model the effect cohesive example documentation has on the correctness of downstream implementations. Providing cohesive examples where we start with the basics (file-based), moving to an example environment implementers are likely to be working in (HTTP-based), and finally explaining how we went from the basic to the complex would give a robust view of what SCITT should look like to implementers and provide them with a clear path to a hopefully correct implementation.\r\n > \r\n > More cohesive documentation will reduce the number of security vulnerabilities we see in our communities code. 
Code which is fundamentally about security in nature. This modification to the charter seeks to act on recent learnings around example code experienced within the SCITT community itself and seeks to contribute to the development of our threat model as we think about SCITT's lifecycle and rollout.\r\n - For this reason I propose we\r\n - where they will be creating the future of SCITT's robust, actively maintained solutions.\r\n - https://mailarchive.ietf.org/arch/msg/scitt/Hz9BSiIN7JHAgsZL6MuDHK4p7P8/\r\n - https://github.com/OR13/endor\r\n - This is learning methodology goldmine.\r\n - https://github.com/ietf-scitt/charter/pull/21\r\n - https://mailarchive.ietf.org/arch/msg/scitt/B9cwkueu3gdQ7lBKkhILcFLD0E4/\r\n- RATS\r\n - https://datatracker.ietf.org/doc/draft-ietf-rats-architecture/\r\n- SBOM\r\n - We [DFFML community] intend to use the \"living\" SBOM VDR capabilities to facilitate the breathing of life into our living threat models. This will allow us to facilitate vulns on architecture.\r\n - https://spdx.github.io/spdx-spec/v2.3/\r\n - https://energycentral.com/c/pip/what-nist-sbom-vulnerability-disclosure-report-vdr\r\n - > The recommendation by NIST to provide software consumers with a NIST VDR is gaining traction as a best practice. The latest version of the SPDX SBOM standard, version 2.3, includes provisions (K.1.9) enabling a software vendor to associate a specific SBOM document for a software product with its online NIST VDR attestation for that product, which is linked within the SBOM. The link refers to a \u201cliving\u201d SBOM VDR document that is updated by a software vendor, whenever new vulnerabilities are reported. Having this \u201calways updated NIST VDR\u201d available enables software consumers to answer the question \u201cWhat is the vulnerability status of my software product from Vendor V, as of NOW?\u201d, providing consumers with on-going, up-to-date visibility into the risks that may be present in an installed software product, as new vulnerabilities (CVE's) are being reported/released.\r\n >\r\n > As stated previously, NIST did not prescribe a format for a NIST VDR attestation, but guidance is provided on what a VDR includes. Reliable Energy Analytics (REA) has produced an open-source \u201cinterpretation\u201d of what a NIST VDR contains in order to meet EO 14028, which is available here in an XML Schema format with samples provided in XML and JSON (https://raw.githubusercontent.com/rjb4standards/REA-Products/master/SBOMVDR_JSON/VDR_118.json) formats.",
"replies": [
{
"body": "## 2022-08-29 @pdxjohnny Engineering Logs\r\n\r\n- Notes to self\r\n - Watched the progress report videos to make sure I know where we're at, thanks past Johns and others\r\n - Realized we should use `CITATION.cff` instead of `myconfig.json` in the examples under today's TODOs\r\n - They seem to form a cohesive if a bit rambling picture.\r\n - Reminded me why I quit caffeine. Sleep is important.\r\n - We could probably do for a 1 minute explainer video on what is Alice\r\n - Below \"Status\" would probably be a good way to start the day tomorrow as the 1 minute video with a breif bit about what is Alice at the begining.\r\n - Alice is our developer helper. We extend her to help us understand and preform various parts of the software development lifecycle. We extend her by writing simple Python functions which are easy for anyone to distribute or combine. She is based on a programming language agnostic format known as the Open Architecture. Eventually we will be able to extend any part of her in any language, or driven by machine learning models.\r\n- SCITT\r\n - Watched https://www.youtube.com/watch?v=6B8Bv0naAIA&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&t=1320s\r\n - SCITT Architecture\r\n - ![image](https://user-images.githubusercontent.com/5950433/187310016-472934fb-e5cc-47e8-875d-a5ea93592074.png)\r\n - Dick's comment here on verification is related to a statement I'd made earlier today\r\n - https://www.youtube.com/watch?v=6B8Bv0naAIA&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&t=1584s\r\n - https://github.com/ietf-scitt/charter/pull/18/files#r957557301\r\n - Roy\r\n - In the case of the notary we have the opportunity to allow for claims that last longer than they are supposed to. The notary concept will allow his buddies to control certs (effectively) on their servers sides.\r\n - Answer to: How's this related to sigstore?\r\n - In SCITT sigstore would send contents to SCITT instance and then notary would put it on a ledger\r\n - In the case of SLSA they also submit to the SCITT store, it looks like at the moment they just plug into one another\r\n - Concerns that we are too software centric with current prospective charter.\r\n - Point taken but they can't scope increase more.\r\n - We want to align efforts across SCITT and OpenSSF to ensure we all work in the same directions\r\n - We can expand to non software use cases later if we flush this out as is first and make sure to design it with extensibility in mind.\r\n - Reviewed https://github.com/ietf-scitt/charter/pull/18/files#diff-7dc19c29f46d126113e2e7fb7b70710fd0fd3100c95564297664f8ceae8c653eR8\r\n - \"For example, a public computer interface system could report its software composition, which can be compared against known software compositions for such a device, as recorded in a public append-only transparent registry.\" (https://github.com/ietf-scitt/charter/tree/60e628f1d718b69dc0d02f7a8168a5485f818201)\r\n - This sounds very similar to something we've talked about before which may be in a stream recording of how we identify the devices which aren't known to be running the \"machines serve humans\" rule, etc.\r\n - This is important for either SCITT or OA to address\r\n - https://github.com/ietf-scitt/charter/pull/18#pullrequestreview-1089013246\r\n- Status\r\n - We want to make sure the contribution process works and is clear. Then we will move on to the data collection portion. Remember we are working over time. We are building the entity at the center of the Trinity, Alice. 
- TODO\r\n - [ ] Organization\r\n - [ ] Daily addition by Alice to engineering log following template\r\n - [ ] Addition of old TODOs from yesterday's logs\r\n - [ ] Export end state of input network / dump everything used by orchestrator\r\n - [ ] pickle\r\n - [ ] JSON\r\n - [ ] Ensure import works (check for state reset in `__aenter__()`, we probably need a generic wrapper to save the memory ones which populates after the `__aenter__()` of the wrapped object.)\r\n - [ ] GraphQL query of cached state using strawberry library or something like that (see the sketch below)\r\n - [ ] Example docs for how to run a flow, then merge with static data as the start state for the cache and then query the whole bit with graphql
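\r\n\r\nFor the GraphQL TODO above, a first pass with the strawberry library might look like the following sketch. Assumptions flagged: the `CachedInput` shape and the hardcoded row are placeholders standing in for the orchestrator's exported input network dump.\r\n\r\n```python\r\n# Minimal strawberry sketch for querying cached flow state.\r\n# The single hardcoded row stands in for data loaded from the\r\n# exported (pickle/JSON) end state of the input network.\r\nfrom typing import List\r\n\r\nimport strawberry\r\n\r\n\r\n@strawberry.type\r\nclass CachedInput:\r\n    definition: str\r\n    value: str\r\n\r\n\r\n@strawberry.type\r\nclass Query:\r\n    @strawberry.field\r\n    def inputs(self) -> List[CachedInput]:\r\n        return [CachedInput(definition=\"repo.git.base.branch\", value=\"main\")]\r\n\r\n\r\nschema = strawberry.Schema(query=Query)\r\nresult = schema.execute_sync(\"{ inputs { definition value } }\")\r\nprint(result.data)\r\n```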
\r\n\r\n---\r\n\r\nTitle: Software Supply Chain Security Guidance Under Executive Order (EO) 14028 Section 4e\r\nFebruary 4, 2022\r\nSource: https://www.nist.gov/system/files/documents/2022/02/04/software-supply-chain-security-guidance-under-EO-14028-section-4e.pdf\r\n\r\nTerminology\r\nSection 4e uses several terms, including \u201cconformity,\u201d \u201cattestation,\u201d and \u201cartifacts.\u201d Because EO 14028 does not define these terms, this guidance presents the following definitions from existing standards and guidance:\r\n\u2022 Conformity assessment is a \u201cdemonstration that specified requirements are fulfilled.\u201d [ISO/IEC 17000] In the context of Section 4e, the requirements are secure software development practices, so conformity assessment is a demonstration that the software producer has followed secure software development practices for their software.\r\n\u2022 Attestation is the \u201cissue of a statement, based on a decision, that fulfillment of specified requirements has been demonstrated.\u201d [ISO/IEC 17000]\r\n o If the software producer itself attests that it conforms to secure software development practices, this is known by several terms, including first-party attestation, self-attestation, declaration, and supplier\u2019s declaration of conformity (SDoC).\r\n o If the software purchaser attests to the software producer\u2019s conformity with secure software development practices, this is known as second-party attestation.\r\n o If an independent third-party attests to the software producer\u2019s conformity with secure software development practices, this is known as third-party attestation or certification.\r\n\u2022 An artifact is \u201ca piece of evidence.\u201d [adapted from NISTIR 7692] Evidence is \u201cgrounds for belief or disbelief; data on which to base proof or to establish truth or falsehood.\u201d [NIST SP 800-160 Vol. 1] Artifacts provide records of secure software development practices.\r\n o Low-level artifacts will be generated during software development, such as threat models, log entries, source code files, source code vulnerability scan reports, testing results, telemetry, or risk-based mitigation decisions for a particular piece of software. These artifacts may be generated manually or by automated means, and they are maintained by the software producer.\r\n o High-level artifacts may be generated by summarizing secure software development practices derived from the low-level artifacts. An example of a high-level artifact is a publicly accessible document describing the methodology, procedures, and processes a software producer uses for its secure practices for software development.\r\nThe following subsections of EO 14028 Section 4e use these terms:\r\n(ii) generating and, when requested by a purchaser, providing artifacts that demonstrate conformance to the processes set forth in subsection (e)(i) of this section;\r\n(v) providing, when requested by a purchaser, artifacts of the execution of the tools and processes described in subsection (e)(iii) and (iv) of this section, and making publicly available summary information on completion of these actions, to include a summary description of the risks assessed and mitigated;\r\n(ix) attesting to conformity with secure software development practices;\r\nIn other words, when a federal agency (purchaser) acquires software or a product containing software, the agency should receive attestation from the software producer that the software\u2019s development complies with government-specified secure software development practices. The federal agency might also request artifacts from the software producer that support its attestation of conformity with the secure software development practices described in Section 4e subsections (i), (iii), and (iv), which are listed here:\r\n(i) secure software development environments, including such actions as:\r\n(A) using administratively separate build environments;\r\n(B) auditing trust relationships;\r\n(C) establishing multi-factor, risk-based authentication and conditional access across the enterprise;\r\n(D) documenting and minimizing dependencies on enterprise products that are part of the environments used to develop, build, and edit software;\r\n(E) employing encryption for data; and\r\n(F) monitoring operations and alerts and responding to attempted and actual cyber incidents;\r\n(iii) employing automated tools, or comparable processes, to maintain trusted source code supply chains, thereby ensuring the integrity of the code;\r\n(iv) employing automated tools, or comparable processes, that check for known and potential vulnerabilities and remediate them, which shall operate regularly, or at a minimum prior to product, version, or update release;"
}
]
},
{
"body": "# 2022-08-30 Engineering Logs",
"replies": [
{
"body": "## 2022-08-30 @pdxjohnny Engineering Logs\r\n\r\n- SCITT\r\n - Responded to review from Henk\r\n - Questions around meaning of term \"file-based\"\r\n - The intent of using the term \"file-based\" was to have an example working with a static serialized form rather than working with a dynamic abstraction layer such as HTTP.\r\n - Updated both lines based on Henk's feedback into one line which addresses the core concern around ensuring the documentation is complete so we end up with a higher likelihood of solid implementations.\r\n - > HTTP-based REST API for Request-Response Interactions including a critical mass of examples as implementation guidance\r\n - https://github.com/ietf-scitt/charter/pull/21#pullrequestreview-1089717428\r\n- Game plan\r\n - [x] `alice please contribute`\r\n - [x] Contribution ready\r\n - [ ] Demo on stream of how write install and publish a third party overlay\r\n - Have the overlay be a function which outputs a return type of `ContributingContents` and takes the name of the project given in a `CITATIONS.cff` file of the CONTRIBUTING example.\r\n - https://github.com/johnlwhiteman/living-threat-models/blob/c027d4e319c715adce104b95f1e88623e02b0949/CITATION.cff\r\n - https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&index=5&t=2303\r\n - https://github.com/intel/dffml/blob/9aeb7f19e541e66fc945c931801215560a8206d7/entities/alice/alice/please/contribute/recommended_community_standards/contributing.py#L48-L54\r\n - [ ] Demo on stream how to write install and contribute a 1st/2nd party overlay, the same code just not third party, from start to finish.\r\n - [ ] `alice shouldi contribute`\r\n - [ ] Support caching / import / export dataflows\r\n - [ ] Support query in easy way (graphql)\r\n - [ ] Support joining with previous runs / more sets of data\r\n - [ ] Contribute the data OpenSSF cares about to their DB via applicable joins and queries\r\n - [ ] Email Christine and CRob\r\n- TODO\r\n - [ ] Organization\r\n - [ ] Daily addition by Alice to engineering log following template\r\n - [ ] Addition of old TODOs yesterday's logs\r\n - [ ] Export end state of input network / dump everything used by orchestrator\r\n - [ ] pickle\r\n - [ ] JSON\r\n - [ ] Ensure import works (check for state reset in `__aenter__()`, we probably need a generic wrapper to save the memory ones which populates after the `__aenter__()` of the wrapped object.\r\n - [ ] GraphQl query of cached state using strawberry library or something like that\r\n - [ ] Example docs for how to run a flow, then merge with static data as the start state for the cache and then query the whole bit with graphql\r\n- TODO\r\n - [ ] How to Publish an Alice Overlay\r\n - [ ] How to Contribute an Alice Overlay\r\n - [ ] Rolling Alice: 2022 Progress Reports: August Status Update\r\n - [ ] Rolling Alice: 2022 Progress Reports: August Activities Recap\r\n\r\n---\r\n\r\n### How to Publish an Alice Overlay\r\n\r\n- Metadata\r\n - Date: 2022-08-30 10:00 UTC -7\r\n- Docs we are following\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n - https://github.com/intel/dffml/tree/alice/entities/alice#recommend-community-standards\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md\r\n\r\n### How to Contribute an Alice Overlay\r\n\r\n- Metadata\r\n - Date: 2022-08-30 10:00 UTC -7\r\n\r\n\r\n### Rolling Alice: 2022 Progress Reports: August Status Update\r\n\r\n- Metadata\r\n - Date: 
2022-08-30 16:28 UTC -7\r\n- https://www.youtube.com/watch?v=THKMfJpPt8I&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&index=9\r\n- https://docs.google.com/presentation/d/1WBz-meM7n6nDe3-133tF1tlDQJ6nYYPySAdMgTHLb6Q/edit?usp=sharing\r\n- https://gist.github.com/pdxjohnny/07b8c7b4a9e05579921aa3cc8aed4866\r\n - Progress report transcripts\r\n- Hello entities of the internet!\r\n- We're building Alice, an Open Artificial General Intelligence, we invite you to join us.\r\n- Today is Alice\u2019s unbirthday. I\u2019m going to tell you a little bit about Alice and the Open Architecture and give a brief status update on where we\u2019re at and how you can get involved.\r\n- Who is Alice?\r\n - Alice will be our developer helper and one day a developer herself. She helps us understand and perform various parts of the software development lifecycle.\r\n - We currently extend her by writing simple Python functions which can be distributed or combined in a decentralized way.\r\n - She is built around a programming language agnostic format known as the Open Architecture.\r\n - Eventually we will be able to extend any part of her in any language, or have parts be driven by machine learning models.\r\n- What is the Open Architecture?\r\n - It's the methodology that we use to interpret any domain specific description of architecture.\r\n - We are developing the open architecture so that we can do a one-hop analysis when looking at any piece of software from a security or other angle.\r\n - Having this generic method to describe any system architecture allows us to knit them together and assess their risk and threat model from a holistic viewpoint.\r\n- Why work on the Open Architecture?\r\n - We want this to be a machine and human interpretable format so that we can facilitate the validation of the reality of the code as it exists in its static form, what it does when you execute it, and what we intend it to do.\r\n - Intent in our case is measured by conformance to and completeness of the threat model, and therefore also the associated open architecture description.\r\n- The entity analysis Trinity\r\n - The entity analysis Trinity helps us conceptualize our process. The points on our Trinity are Intent, Dynamic Analysis, and Static Analysis.\r\n - By measuring and forming understanding in these areas we will be able to triangulate the strategic plans and principles involved in the execution of the software as well as its development lifecycle.\r\n - We use the Trinity to represent the soul of the software.\r\n- What happens when we work on Alice?\r\n - We build up Alice's understanding of software engineering as we automate the collection of data which represents our understanding of it.\r\n - We also teach her how to automate parts of the development process, making contributions and other arbitrary things.\r\n - Over time we'll build up a corpus of training data from which we'll build machine learning models.\r\n - We will eventually introduce feedback loops where these models make decisions about development / contribution actions to be taken when given a codebase.\r\n - We want to make sure that when Alice is deciding what code to write and contribute, that she is following our organizationally applicable policies. 
As outlined maybe in part via our threat model.\r\n- Who is working on Alice?\r\n - The DFFML community and anyone and everyone who would like to join us.\r\n - Our objective is to build Alice with transparency, freedom, privacy, security, and egalitarianism as critical factors in her strategic principles.\r\n- How does one get involved?\r\n - You can get involved by engaging with the DFFML community via the following links\r\n - Every time we contribute new functionality to Alice we write a tutorial on how that functionality can be extended and customized.\r\n - We would love if you joined us in teaching Alice something about software development, or anything, and teaching others in the process.\r\n - It's as easy as writing a single function and explaining your thought process.\r\n - The link on the left will take you to the code and tutorials.\r\n - We are also looking for folks who would like to contribute by brainstorming and thinking about AI and especially AI ethics.\r\n - The link on the right will take you to a document we are collaboratively editing and contributing to.\r\n- Now for a status update. (Progress to date)\r\n - Alice can make contributions, we've laid the foundations for the automation of the software development process.\r\n - Our next step is to help her understand what she's looking at, what is the code, how can she use the source Luke?\r\n- Plans\r\n - As such our top priorities right now are\r\n - Ensuring the contribution process to what exists (`alice please contribute`) is rock solid.\r\n - Building out and making `alice shouldi contribute` accessible and ready for contribution.\r\n - Engaging with those that are collecting metrics (https://metrics.openssf.org) and ensuring our work on metric collection bears fruit.\r\n - Following our engagement on the metric collection front we will perform analysis to determine how to best target further `alice please contribute` efforts and align the two with a documented process on how we select high value targets so that others can pick up and run with extending.\r\n - Participating organizations in parallel begin automated outreach via Alice please contribute\r\n - Later we'll get into more details on the dynamic analysis portion of the Trinity, where we'll work, over time, across many program executions of the code we are working on, to understand how its execution maps to the work that we're doing via our understanding of what we've done (`please contribute`) and what we're doing it on (`alice shouldi contribute`).\r\n- Unused\r\n - Alice's contribution docs have been live for about a month. We're currently focused on making sure the contribution process works and is clear. Any and all feedback is appreciated.\r\n - After we're sure that Alice's contribution docs are solid we'll begin focusing on her data mining capabilities.\r\n - We are building the entity at the center of the software / entity analysis Trinity, Alice.\r\n - The `alice please contribute` command falls under the Static Analysis point on the trinity.\r\n - The Open Architecture, IETF SCITT, Web5, SBOM and other formats are all used or planned to be used in the top portion, Intent.\r\n - We are building the entity using the architecture. The intermediate and serialized forms of the Open Architecture will be used to represent the findings of our static and dynamic analysis.\r\n- TODO\r\n - [x] Slide Deck\r\n\r\n### Rolling Alice: 2022 Progress Reports: August Activities Recap\r\n\r\n- Metadata\r\n - Date: 2022-08-30 10:00 UTC -7"
},
{
"body": "https://github.com/opensbom-generator/spdx-sbom-generator"
},
{
"body": "https://huggingface.co/spaces/huggingface/diffuse-the-rest"
}
]
},
{
"body": "# 2022-08-31 Engineering Logs\r\n\r\n- SCITT\r\n - https://github.com/ietf-scitt/charter/pull/21",
"replies": [
{
"body": "## 2022-08-31 @pdxjohnny Engineering Logs\r\n\r\n- Game plan\r\n - [ ] `alice please contribute`\r\n - [x] README\r\n - [x] CONTRIBUTING\r\n - [x] CODE_OF_CONDUCT\r\n - https://www.youtube.com/watch?v=u2lGjMMIlAo&list=PLtzAOVTpO2ja6DXSCzoF3v_mQDh7l0ymH\r\n - https://github.com/intel/dffml/commit/6c1719f9ec779a9d64bfb3b364e2c41c5ac9aab7\r\n - [ ] SECURITY\r\n - [ ] SUPPORT\r\n - [ ] CITATION.cff\r\n - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-citation-files\r\n - auto populate with 000 UUIDs\r\n - [ ] CODEOWNERS\r\n - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-code-owners\r\n - [ ] Demo on stream of how write install and publish a third party overlay\r\n - Have the overlay be a function which outputs a return type of `ContributingContents` and takes the name of the project given in a `CITATIONS.cff` file as another our open source guide example.\r\n - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-citation-files\r\n - https://github.com/johnlwhiteman/living-threat-models/blob/c027d4e319c715adce104b95f1e88623e02b0949/CITATION.cff\r\n - https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&index=5&t=2303\r\n - https://github.com/intel/dffml/blob/9aeb7f19e541e66fc945c931801215560a8206d7/entities/alice/alice/please/contribute/recommended_community_standards/contributing.py#L48-L54\r\n - [ ] Demo on stream how to write install and contribute a 1st/2nd party overlay, the same code just not third party, from start to finish.\r\n - CITATION.cff\r\n - [ ] `alice shouldi contribute`\r\n - [ ] Support caching / import / export dataflows\r\n - [ ] Support query in easy way (graphql)\r\n - [ ] Support joining with previous runs / more sets of data\r\n - [ ] Contribute the data OpenSSF cares about to their DB via applicable joins and queries\r\n - [ ] Email Christine and CRob\r\n- TODO\r\n - [ ] Organization\r\n - [ ] Daily addition by Alice to engineering log following template\r\n - [ ] Addition of old TODOs yesterday's logs\r\n - [ ] Export end state of input network / dump everything used by orchestrator\r\n - [ ] pickle\r\n - [ ] JSON\r\n - [ ] Ensure import works (check for state reset in `__aenter__()`, we probably need a generic wrapper to save the memory ones which populates after the `__aenter__()` of the wrapped object.\r\n - [ ] GraphQl query of cached state using strawberry library or something like that\r\n - [ ] Example docs for how to run a flow, then merge with static data as the start state for the cache and then query the whole bit with graphql\r\n- TODO\r\n - [x] Splice out Code of Conduct contribution demo from July progress report video\r\n - [x] Add PR and reference PR as example in tutorial along with spliced out `alice please contribute recommended community standards` contribution demo clip\r\n - [ ] How to Publish an Alice Overlay\r\n - [ ] How to Contribute an Alice Overlay\r\n - [ ] Rolling Alice: 2022 Progress Reports: August Activities Recap\r\n\r\n---\r\n\r\n### How to Publish an Alice Overlay\r\n\r\n- Metadata\r\n - Date: 2022-08-30 10:00 UTC -7\r\n- Docs we are following\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n - https://github.com/intel/dffml/tree/alice/entities/alice#recommend-community-standards\r\n - 
https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md\r\n\r\n### How to Contribute an Alice Overlay\r\n\r\n- Metadata\r\n - Date: 2022-08-30 10:00 UTC -7\r\n\r\n### Rolling Alice: 2022 Progress Reports: August Activities Recap\r\n\r\n- Metadata\r\n - Date: 2022-08-30 10:00 UTC -7\r\n\r\n---\r\n\r\n- Failed attempt to get class defined variables with op decorated functions defined in dataflow classes\r\n - Can't inspect class currently being defined. Can't find the `NewType` references\r\n\r\n```diff\r\ndiff --git a/dffml/df/base.py b/dffml/df/base.py\r\nindex 4f84c1c7c8..df83d7b612 100644\r\n--- a/dffml/df/base.py\r\n+++ b/dffml/df/base.py\r\n@@ -345,7 +345,30 @@ def op(\r\n\r\n forward_refs_from_cls = None\r\n if hasattr(func, \"__qualname__\") and \".\" in func.__qualname__:\r\n+\r\n+ def stack_feedface(max_depth=4):\r\n+ from pprint import pprint\r\n+ # Grab stack frames\r\n+ frames = inspect.stack()\r\n+ for i, frame_info in enumerate(frames):\r\n+ pprint(frame_info)\r\n+ breakpoint()\r\n+ continue\r\n+ if max_depth != -1 and i >= max_depth:\r\n+ break\r\n+ if (\r\n+ frame_info.function == method_name\r\n+ and \"self\" in frame_info.frame.f_locals\r\n+ and frame_info.frame.f_locals[\"self\"] is obj\r\n+ ):\r\n+ return True\r\n+ return False\r\n+\r\n # Attempt to lookup type definitions defined within class\r\n+ if func.__qualname__.split(\".\")[0] == \"OverlayCODEOWNERS\":\r\n+ stack_feedface()\r\n+ breakpoint()\r\n+\r\n forward_refs_from_cls = getattr(\r\n sys.modules[func.__module__],\r\n func.__qualname__.split(\".\")[0],\r\n```"
}
]
},
{
"body": "# 2022-09-01 Engineering Logs\r\n\r\n- Community\r\n - \u201cHeros are not giant statues framed against a red sky. They are people who say this is my community, and it\u2019s my responsibility to make it better.\u201d [Oregon Governor Tom McCall]\r\n- WebUI\r\n - https://jsoncrack.com/editor\r\n - We could leverage JSON Crack to provide easy editing of seed data\r\n - Cloud fork and extend the JSON Crack project to add support for visualizing dataflows\r\n - Previously when using react-flow (https://github.com/wbkd/react-flow) we had used mermaid output SVG cords to find where to place nodes, we could probably just pull that code out of mermaid\r\n - We could do something like the Intuitive and Accessible Documentation Editing GSoC 2022 project where we swap out the mermaid diagram for the extended version of the JSON Crack editor to make the operations in the nodes editable. This is helpful when using operations such as `run_dataflow()` which can have alternate inputs. Any operation defined as a class `OperationImplementation`/`OperationImplementationContext` within the `run()` method of the context we can take the inputs as a dictionary as an argument.\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/187969698-2d572d99-9f20-4618-b1bb-086add503f7e.png)\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/187969864-3b38fcb4-de02-4e47-b57e-f8a62f0f8f11.png)\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/187970084-ab027823-efce-4d42-8146-6b7caf12f328.png)",
"replies": [
{
"body": "## 2022-09-01 @pdxjohnny Engineering Logs\r\n\r\n- Game plan\r\n - [ ] `alice please contribute`\r\n - [x] README\r\n - [x] CONTRIBUTING\r\n - [x] CODE_OF_CONDUCT\r\n - https://www.youtube.com/watch?v=u2lGjMMIlAo&list=PLtzAOVTpO2ja6DXSCzoF3v_mQDh7l0ymH\r\n - https://github.com/intel/dffml/commit/6c1719f9ec779a9d64bfb3b364e2c41c5ac9aab7\r\n - [ ] SECURITY\r\n - [ ] SUPPORT\r\n - [ ] .gitignore\r\n - Dump files add common ignores, collect all inputs derived from file name and of type `GitIgnoreLine` using `group_by` in output flow\r\n - [ ] CITATION.cff\r\n - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-citation-files\r\n - auto populate with 000 UUIDs\r\n - [ ] CODEOWNERS\r\n - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-code-owners\r\n - [ ] Demo on stream of how write install and publish a third party overlay\r\n - Have the overlay be a function which outputs a return type of `ContributingContents` and takes the name of the project given in a `CITATIONS.cff` file as another our open source guide example.\r\n - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-citation-files\r\n - https://github.com/johnlwhiteman/living-threat-models/blob/c027d4e319c715adce104b95f1e88623e02b0949/CITATION.cff\r\n - https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&index=5&t=2303\r\n - https://github.com/intel/dffml/blob/9aeb7f19e541e66fc945c931801215560a8206d7/entities/alice/alice/please/contribute/recommended_community_standards/contributing.py#L48-L54\r\n - [ ] Demo on stream how to write install and contribute a 1st/2nd party overlay, the same code just not third party, from start to finish.\r\n - CITATION.cff\r\n - [ ] `alice shouldi contribute`\r\n - [ ] Support caching / import / export dataflows\r\n - [ ] Support query in easy way (graphql)\r\n - [ ] Support joining with previous runs / more sets of data\r\n - [ ] Contribute the data OpenSSF cares about to their DB via applicable joins and queries\r\n - [ ] Email Christine and CRob\r\n- TODO\r\n - [ ] Organization\r\n - [ ] Daily addition by Alice to engineering log following template\r\n - [ ] Addition of old TODOs yesterday's logs\r\n - [ ] Export end state of input network / dump everything used by orchestrator\r\n - [ ] pickle\r\n - [ ] JSON\r\n - [ ] Ensure import works (check for state reset in `__aenter__()`, we probably need a generic wrapper to save the memory ones which populates after the `__aenter__()` of the wrapped object.\r\n - [ ] GraphQl query of cached state using strawberry library or something like that\r\n - [ ] Example docs for how to run a flow, then merge with static data as the start state for the cache and then query the whole bit with graphql\r\n- TODO\r\n - [ ] Sidestep failure to wrap with `@op` decorator on\r\n - [ ] `with dffml.raiseretry():` around `gh` grabbing issue title\r\n - Avoid potential resource not available yet after creation server side\r\n - [ ] `try: ... 
catch exception as error: raise RetryOperationException from error` in `run` (above `run_no_retry()`)\r\n - [ ] How to Publish an Alice Overlay\r\n - [ ] How to Contribute an Alice Overlay\r\n - [ ] Rolling Alice: 2022 Progress Reports: August Activities Recap\r\n\r\n---\r\n\r\n### How to Publish an Alice Overlay\r\n\r\n- Metadata\r\n - Date: 2022-08-30 10:00 UTC -7\r\n- Docs we are following\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n - https://github.com/intel/dffml/tree/alice/entities/alice#recommend-community-standards\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md\r\n\r\n### How to Contribute an Alice Overlay\r\n\r\n- Metadata\r\n - Date: 2022-08-30 10:00 UTC -7\r\n\r\n### Rolling Alice: 2022 Progress Reports: August Activities Recap\r\n\r\n- Metadata\r\n - Date: 2022-08-30 10:00 UTC -7"
},
{
"body": "## GSoC 2022: Intuitive and Accessible Documentation Editing: Checkpoint Meeting\r\n\r\n- https://github.com/intel/dffml/issues/1319\r\n- .gitpod.yml\r\n - https://github.com/pfmoore/editables\r\n - PEP 660 Fallout\r\n - https://github.com/pfmoore/editables/issues/21\r\n - https://github.com/intel/dffml/issues/1412\r\n - Trying to `dffml service dev docs` with JS to do `localstorage` tricks\r\n - Got gitpod env up and running and docs building and button auto adding on page load"
}
]
},
{
"body": "# 2022-09-02 Engineering Logs\r\n\r\n- SCITT\r\n - Explainer on IETF Supply Chain Integrity, Transparency, and Trust (SCITT) working group\r\n - The proposed SCITT charter sets two goals:\r\n - Standardize the overall security flows for securing a software supply chain, covering the essential building blocks that make up the architecture, and\r\n - specify these building blocks, employing the existing work already done within other IETF WGs such as COSE WG, and IETF RATS WG, as appropriate.\r\n - This is an example Use Case doc: https://github.com/ietf-scitt/use-cases/blob/main/hardware_microelectronics.md which might help as a quick primer to help understand what SCITT is about.\r\n - Here is the draft SCITT charter for background: https://datatracker.ietf.org/doc/charter-ietf-scitt/\r\n - Here is the draft SCITT architecture: https://datatracker.ietf.org/doc/draft-birkholz-scitt-architecture/\r\n - Here is a recent mailing list email with more context: https://mailarchive.ietf.org/arch/msg/scitt/ZefYIxvkC_I-sgXETVoJeaYwFB4/ \r\n - The charter has been currently scoped to software, but there are folks thinking about how it could be extended to other areas following implementation for software.\r\n - We're looking at a combination of SCITT plus overlays for threat modeling and policy as we analyze and communicate data on the software lifecycle for the OpenSSF Identifying Security Threats / Metrics WGs.\r\n - Aligned use cases\r\n - https://github.com/ietf-scitt/use-cases/issues/7\r\n - https://github.com/ietf-scitt/use-cases/issues/8\r\n - https://github.com/ietf-scitt/use-cases/issues/4\r\n - https://github.com/ietf-scitt/use-cases/issues/11\r\n - https://github.com/ietf-scitt/use-cases/issues/12\r\n- Completed v2 of Entity/System/Software Analysis Trinity\r\n - [EntityAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9479846/EntityAnalysisTrinity.drawio.xml.txt)\r\n - [EntityAnalysisTrinity.svg](https://user-images.githubusercontent.com/5950433/188203911-3586e1af-a1f6-434a-8a9a-a1795d7a7ca3.svg)\r\n - [EntityAnalysisTrinity.jpg](https://user-images.githubusercontent.com/5950433/188203498-2d7a9f50-ba1b-41ad-84b4-90434d4d9240.jpg)\r\n - [EntityAnalysisTrinity.png](https://user-images.githubusercontent.com/5950433/188203501-45e00b72-1d1e-4dc4-b3ca-3fd445369c8d.png)\r\n - [EntityAnalysisTrinity.pdf](https://github.com/intel/dffml/files/9479847/EntityAnalysisTrinity.drawio.xml.txt.drawio.pdf)\r\n\r\n![EntityAnalysisTrinity drawio xml txt](https://user-images.githubusercontent.com/5950433/188203911-3586e1af-a1f6-434a-8a9a-a1795d7a7ca3.svg)\r\n",
"replies": [
{
"body": "## 2022-09-02 @pdxjohnny Engineering Logs\r\n\r\n- Game plan\r\n - [ ] `alice please contribute`\r\n - [x] README\r\n - [x] CONTRIBUTING\r\n - [x] CODE_OF_CONDUCT\r\n - https://www.youtube.com/watch?v=u2lGjMMIlAo&list=PLtzAOVTpO2ja6DXSCzoF3v_mQDh7l0ymH\r\n - https://github.com/intel/dffml/commit/6c1719f9ec779a9d64bfb3b364e2c41c5ac9aab7\r\n - [ ] SECURITY\r\n - [ ] SUPPORT\r\n - [ ] .gitignore\r\n - Dump files add common ignores, collect all inputs derived from file name and of type `GitIgnoreLine` using `group_by` in output flow\r\n - [ ] CITATION.cff\r\n - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-citation-files\r\n - auto populate with 000 UUIDs\r\n - [ ] CODEOWNERS\r\n - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-code-owners\r\n - [ ] Demo on stream of how write install and publish a third party overlay\r\n - Have the overlay be a function which outputs a return type of `ContributingContents` and takes the name of the project given in a `CITATIONS.cff` file as another our open source guide example.\r\n - https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-citation-files\r\n - https://github.com/johnlwhiteman/living-threat-models/blob/c027d4e319c715adce104b95f1e88623e02b0949/CITATION.cff\r\n - https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw&index=5&t=2303\r\n - https://github.com/intel/dffml/blob/9aeb7f19e541e66fc945c931801215560a8206d7/entities/alice/alice/please/contribute/recommended_community_standards/contributing.py#L48-L54\r\n - [ ] Demo on stream how to write install and contribute a 1st/2nd party overlay, the same code just not third party, from start to finish.\r\n - CITATION.cff\r\n - [ ] `alice shouldi contribute`\r\n - [ ] Support caching / import / export dataflows\r\n - [ ] Support query in easy way (graphql)\r\n - [ ] Support joining with previous runs / more sets of data\r\n - [ ] Contribute the data OpenSSF cares about to their DB via applicable joins and queries\r\n - [ ] Email Christine and CRob\r\n- TODO\r\n - [ ] Organization\r\n - [ ] Daily addition by Alice to engineering log following template\r\n - [ ] Addition of old TODOs yesterday's logs\r\n - [ ] Export end state of input network / dump everything used by orchestrator\r\n - [ ] pickle\r\n - [ ] JSON\r\n - [ ] Ensure import works (check for state reset in `__aenter__()`, we probably need a generic wrapper to save the memory ones which populates after the `__aenter__()` of the wrapped object.\r\n - [ ] GraphQl query of cached state using strawberry library or something like that\r\n - [ ] Example docs for how to run a flow, then merge with static data as the start state for the cache and then query the whole bit with graphql\r\n- TODO\r\n - [ ] Sidestep failure to wrap with `@op` decorator on\r\n - [ ] `with dffml.raiseretry():` around `gh` grabbing issue title\r\n - Avoid potential resource not available yet after creation server side\r\n - [ ] `try: ... 
catch exception as error: raise RetryOperationException from error` in `run` (above `run_no_retry()`)\r\n - [ ] How to Publish an Alice Overlay\r\n - [ ] How to Contribute an Alice Overlay\r\n - [ ] Rolling Alice: 2022 Progress Reports: August Activities Recap\r\n\r\n---\r\n\r\n### How to Publish an Alice Overlay\r\n\r\n- Metadata\r\n - Date: 2022-08-30 10:00 UTC -7\r\n- Docs we are following\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n - https://github.com/intel/dffml/tree/alice/entities/alice#recommend-community-standards\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md\r\n\r\n### How to Contribute an Alice Overlay\r\n\r\n- Metadata\r\n - Date: 2022-08-30 10:00 UTC -7\r\n\r\n### Raise Retry from Exception for Problematic Operations\r\n\r\n- Metadata\r\n - Date: 2022-09-02 11:20 UTC -7\r\n- `with dffml.raiseretry():` around `gh` grabbing issue title\r\n - Avoid potential resource not available yet after creation server side\r\n- `try: ... catch exception as error: raise RetryOperationException from error` in `run` (above `run_no_retry()`)\r\n\r\n```diff\r\ndiff --git a/dffml/df/base.py b/dffml/df/base.py\r\nindex 4f84c1c7c..b2d23a678 100644\r\n--- a/dffml/df/base.py\r\n+++ b/dffml/df/base.py\r\n@@ -15,11 +15,12 @@ from typing import (\r\n Union,\r\n Optional,\r\n Set,\r\n+ ContextManager,\r\n )\r\n from dataclasses import dataclass, is_dataclass, replace\r\n from contextlib import asynccontextmanager\r\n \r\n-from .exceptions import NotOpImp\r\n+from .exceptions import NotOpImp, RetryOperationException\r\n from .types import (\r\n Operation,\r\n Input,\r\n@@ -94,6 +95,7 @@ class OperationImplementationContext(BaseDataFlowObjectContext):\r\n self.parent = parent\r\n self.ctx = ctx\r\n self.octx = octx\r\n+ self.op_retries = None\r\n \r\n @property\r\n def config(self):\r\n@@ -102,6 +104,31 @@ class OperationImplementationContext(BaseDataFlowObjectContext):\r\n \"\"\"\r\n return self.parent.config\r\n \r\n+\r\n+ @contextlib.contextmanager\r\n+ def raiseretry(self, retries: int) -> ContextManager[None]:\r\n+ \"\"\"\r\n+ Use this context manager to have the orchestrator call the operation's\r\n+ ``run()`` method multiple times within the same\r\n+ OperationImplementationContext entry.\r\n+\r\n+ Useful for\r\n+\r\n+ TODO\r\n+\r\n+ - Backoff\r\n+\r\n+ >>> def myop(self):\r\n+ ... with self.raiseretry(5):\r\n+ ... if self.op_current_retry < 4:\r\n+ ... 
raise Exception()\r\n+ \"\"\"\r\n+ try:\r\n+ yield\r\n+ except Exception as error:\r\n+ raise RetryOperationException(retries) from error\r\n+\r\n+\r\n @abc.abstractmethod\r\n async def run(self, inputs: Dict[str, Any]) -> Union[bool, Dict[str, Any]]:\r\n \"\"\"\r\ndiff --git a/dffml/df/exceptions.py b/dffml/df/exceptions.py\r\nindex b1f3bcc87..e185cf22c 100644\r\n--- a/dffml/df/exceptions.py\r\n+++ b/dffml/df/exceptions.py\r\n@@ -28,3 +28,8 @@ class ValidatorMissing(Exception):\r\n \r\n class MultipleAncestorsFoundError(NotImplementedError):\r\n pass\r\n+\r\n+\r\n+class RetryOperationException(Exception):\r\n+ def __init__(self, retries: int) -> None:\r\n+ self.retries = retries\r\ndiff --git a/dffml/df/memory.py b/dffml/df/memory.py\r\nindex 59286d492..ca0a77cc6 100644\r\n--- a/dffml/df/memory.py\r\n+++ b/dffml/df/memory.py\r\n@@ -26,6 +26,7 @@ from .exceptions import (\r\n DefinitionNotInContext,\r\n ValidatorMissing,\r\n MultipleAncestorsFoundError,\r\n+ RetryOperationException,\r\n )\r\n from .types import (\r\n Input,\r\n@@ -1187,6 +1188,7 @@ class MemoryOperationImplementationNetworkContext(\r\n ctx: BaseInputSetContext,\r\n octx: BaseOrchestratorContext,\r\n operation: Operation,\r\n+ opctx: OperationImplementationContext,\r\n inputs: Dict[str, Any],\r\n ) -> Union[bool, Dict[str, Any]]:\r\n \"\"\"\r\n@@ -1195,9 +1197,7 @@\r\n # Check that our network contains the operation\r\n await self.ensure_contains(operation)\r\n # Create an opimp context and run the operation\r\n- async with self.operations[operation.instance_name](\r\n- ctx, octx\r\n- ) as opctx:\r\n+ with contextlib.nullcontext():\r\n self.logger.debug(\"---\")\r\n self.logger.debug(\r\n \"%s Stage: %s: %s\",\r\n@@ -1248,22 +1248,29 @@\r\n \"\"\"\r\n Run an operation in our network.\r\n \"\"\"\r\n- if not operation.retry:\r\n- return await self.run_no_retry(ctx, octx, operation, inputs)\r\n- for retry in range(0, operation.retry):\r\n- try:\r\n- return await self.run_no_retry(ctx, octx, operation, inputs)\r\n- except Exception:\r\n- # Raise if no more tries left\r\n- if (retry + 1) == operation.retry:\r\n- raise\r\n- # Otherwise if there was an exception log it\r\n- self.logger.error(\r\n- \"%r: try %d: %s\",\r\n- operation.instance_name,\r\n- retry + 1,\r\n- traceback.format_exc().rstrip(),\r\n- )\r\n+ async with self.operations[operation.instance_name](\r\n+ ctx, octx\r\n+ ) as opctx:\r\n+ opctx.retries = operation.retry\r\n+ for retry in range(0, operation.retry):\r\n+ try:\r\n+ return await self.run_no_retry(ctx, octx, operation, opctx, inputs)\r\n+ except Exception as error:\r\n+ retries = operation.retry\r\n+ if isinstance(error, RetryOperationException):\r\n+ retries = error.retries\r\n+ if not retries:\r\n+ raise\r\n+ # Raise if no more tries left\r\n+ if (retry + 1) == retries:\r\n+ raise\r\n+ # Otherwise if there was an exception log it\r\n+ self.logger.error(\r\n+ \"%r: try %d: %s\",\r\n+ operation.instance_name,\r\n+ retry + 1,\r\n+ traceback.format_exc().rstrip(),\r\n+ )\r\n \r\n async def operation_completed(self):\r\n await self.completed_event.wait()\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/readme.py b/entities/alice/alice/please/contribute/recommended_community_standards/readme.py\r\nindex 437601358..836d8f175 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/readme.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/readme.py\r\n@@ 
-183,10 +183,11 @@ class OverlayREADME:\r\n \"\"\"\r\n Use the issue title as the pull request title\r\n \"\"\"\r\n- async for event, result in dffml.run_command_events(\r\n- [\"gh\", \"issue\", \"view\", \"--json\", \"title\", \"-q\", \".title\", readme_issue,],\r\n- logger=self.logger,\r\n- events=[dffml.Subprocess.STDOUT],\r\n- ):\r\n- if event is dffml.Subprocess.STDOUT:\r\n- return result.strip().decode()\r\n+ with self.raiseretry(5):\r\n+ async for event, result in dffml.run_command_events(\r\n+ [\"gh\", \"issue\", \"view\", \"--json\", \"title\", \"-q\", \".title\", readme_issue,],\r\n+ logger=self.logger,\r\n+ events=[dffml.Subprocess.STDOUT],\r\n+ ):\r\n+ if event is dffml.Subprocess.STDOUT:\r\n+ return result.strip().decode()\r\n```"
},
{
"body": "https://www.cnn.com/2022/09/03/tech/ai-art-fair-winner-controversy/index.html"
}
]
},
{
"body": "- TODO\r\n - Messagw Alice on signal to add ti this thread",
"replies": []
},
{
"body": "# 2022-09-06 Engineering Logs\r\n\r\n- References\r\n - https://madebyoll.in/posts/game_emulation_via_dnn/\r\n - https://e2eml.school/transformers.html\r\n - Thought: context aware markov\r\n - https://ieeexplore.ieee.org/document/9540871\r\n - https://twitter.com/konstinx/status/1567036083862396932",
"replies": [
{
"body": "## 2022-09-06 @pdxjohnny Engineering Logs\r\n\r\n- User reports need for bypass the validation on insert of each record to mongodb source.\r\n - https://www.mongodb.com/docs/manual/core/schema-validation/bypass-document-validation/\r\n - > To bypass the validation rules and insert the invalid document, run the following `insert` command, which sets the `bypassDocumentValidation` option to `true`:\r\n > ```javascript\r\n > db.runCommand( {\r\n > insert: \"students\",\r\n > documents: [\r\n > {\r\n > name: \"Alice\",\r\n > year: Int32( 2016 ),\r\n > major: \"History\",\r\n > gpa: Double(3.0),\r\n > address: {\r\n > city: \"NYC\",\r\n > street: \"33rd Street\"\r\n > }\r\n > }\r\n > ],\r\n > bypassDocumentValidation: true\r\n > } )\r\n > ```\r\n- References\r\n - https://duckduckgo.com/?q=validation+level+mongodb&t=canonical&ia=web\r\n - https://www.mongodb.com/docs/compass/current/validation/\r\n - https://www.mongodb.com/docs/manual/core/schema-validation/\r\n - https://www.mongodb.com/docs/manual/core/schema-validation/specify-validation-level/#std-label-schema-specify-validation-level\r\n - https://www.mongodb.com/docs/manual/core/schema-validation/bypass-document-validation/\r\n- Updating `MongoDBSource`\r\n- References\r\n - https://duckduckgo.com/?q=motor+mongo+asyncio+bypassDocumentValidation&t=canonical&ia=web\r\n - https://motor.readthedocs.io/en/stable/tutorial-asyncio.html#inserting-a-document\r\n - https://motor.readthedocs.io/en/stable/api-asyncio/asyncio_motor_collection.html#motor.motor_asyncio.AsyncIOMotorCollection.insert_one\r\n - > *bypass_document_validation* requires server version **>= 3.2**\r\n - *bypass_document_validation*: (optional) If `True`, allows the write to opt-out of document level validation. Default is `False`.\r\n - https://github.com/intel/dffml/blob/7627341b66f6209b85ea4ae74e3fb4159d125d30/source/mongodb/dffml_source_mongodb/source.py#L32-L39\r\n - https://motor.readthedocs.io/en/stable/api-asyncio/asyncio_motor_collection.html#motor.motor_asyncio.AsyncIOMotorCollection.replace_one\r\n- TODO\r\n - [ ] Docs on on open source async first development model in a way which is a quick onramp to the fully connected development model.\r\n - [ ] Allow for user to bypass the validation on insert of each record to mongodb source."
},
{
"body": "## GSoC 2022: Intuitive and Accessible Documentation Editing: Meeting\r\n\r\n- https://github.com/intel/dffml/issues/1392"
},
{
"body": "Recompute / repripirtiize / associte higher priortiiy with markov chains regeneratated from most recently appllicabke context"
}
]
},
{
"body": "# 2022-09-07 Engineering Logs",
"replies": [
{
"body": "## 2022-09-07 @pdxjohnny Engineering Logs\r\n\r\n- Update Trinity to v3: Fix direction of short loop arrows\r\n - [EntityAnalysisTrinity.svg](https://user-images.githubusercontent.com/5950433/188937161-f107af83-50dd-4deb-a951-1aebf9762a31.svg)\r\n - [EntityAnalysisTrinity.jpg](https://user-images.githubusercontent.com/5950433/188937164-88bd4773-bc37-4c28-ba01-945b6c729f42.jpg)\r\n - [EntityAnalysisTrinity.pdf](https://github.com/intel/dffml/files/9508224/EntityAnalysisTrinity.drawio.xml.txt.drawio.pdf)\r\n - [EntityAnalysisTrinity.png](https://user-images.githubusercontent.com/5950433/188937146-876ada14-60fd-41d6-953b-652099168a22.png)\r\n - [EntityAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9508223/EntityAnalysisTrinity.drawio.xml.txt)\r\n\r\n![EntityAnalysisTrinity.svg](https://user-images.githubusercontent.com/5950433/188937161-f107af83-50dd-4deb-a951-1aebf9762a31.svg)\r\n\r\n- All information will be taggable\r\n - Not all information will be tagged\r\n - We are adding links, like a giant version of Wikipedia\r\n- TODO\r\n - [ ] Deduplicate docs code as we unify operations, data flows, and classes who no longer need separate config dumping code now that everything hooks into the `typing` system."
},
{
"body": "## SCITT Reference Implementation\r\n\r\n- Goal\r\n - Example graph for one use case\r\n- Search engines auto query RDF JSONLD\r\n - VCs are in RDF by default so you get the graph for free\r\n- Kiran from Microsoft, hardware background\r\n- Orie from Transmute\r\n- IETF goals are to define building blocks and keep it generic\r\n- It makes sense to have a reference implementation\r\n - What level do we want?\r\n - Toy\r\n - Hosted\r\n - Ecosystem\r\n- Let's build code along with the spec\r\n- SCITT building blocks are so far out from sandardisation pro\r\n- Fundamentally supply chain is about peices that interact\r\n - Best hting we can do is workshop\r\n- Transmute is implementing examples to show SCITT will work for hardware as well\r\n - Orie will have some use cases which will have payloads which will have cliams which might be SBOMs\r\n - This way we both mention how SBOM would be a node in the graph so it helps us work out common use cases\r\n- If we had these claims? What kinds of questions could we awnser\r\n- How is an issuer releated to a software artifact, related to a CVe, in a couple example payload formats\r\n- Intent to define example payloads and places to collect them\r\n - Let's have the converstatoin on the mailing list\r\n - Feedback may be that Payload specifics are out of scope for the work\r\n - We still what to talk about what kind of knowledge we want to represent with these opace payloads\r\n - We can start and OpenSSF Use Case doc\r\n - https://github.com/ietf-scitt/use-cases/issues/14\r\n - John to send out email to mailing list and add ID sec threatds group with to as Mike."
}
]
},
{
"body": "# 2022-09-08 Engineering Logs\r\n\r\n- https://github.com/Wilfred/difftastic",
"replies": [
{
"body": "## 2022-09-08 @pdxjohnny Engineering Logs\r\n\r\n- The Entity Analysis Trinity can help us conceptualize how to manifest realities via it's lifecycle feedback loop over time.\r\n - https://twitter.com/ParissAthena/status/1567690882865926144\r\n- https://cwarny.medium.com/an-illustration-of-software-2-0-3937f620cea1\r\n - Rajesh and I talked about how Alice is a hybrid of (what is called in the referenced blog post) \"software 1.0\" and \"software 2.0\".\r\n - Alice is a hybrid of software 1.0 and 2.0. We leverage the Open Architecture and Living Threat Models to apply context aware policy to both paradigms.\r\n - It\u2019s important to do depth of field research so that one can understand discourse within a community\r\n - interacting with open source communities explainer?\r\n- https://twitter.com/lorenc_dan/status/1567874273913585665\r\n - Came across Dan's tweet\r\n - Reminded me of: https://github.com/intel/dffml/issues/1315#issuecomment-1066814280\r\n - ![Anarchy Elmo Says \u201cChaos for the Chaos God\u201d](https://user-images.githubusercontent.com/5950433/189168046-a20c0973-b49f-41be-82b5-a66ef53f853d.jpeg)\r\n - Interns having root may be a CISO\u2019s nightmare but it\u2019s Alice\u2019s dream. A learning wonderland.\r\n - Wondered who the chaos god is so did a search\r\n - The God of Chaos is considered the one God\r\n - https://greekgodsandgoddesses.net/gods/chaos/\r\n - The Hebrew God, also known as the God of knowledge, is also considered the one God\r\n - 110fbeeed4580b05144deea8f2fdbb6793b7f7be\r\n- Finally reading the Alice (#1369) discussion thread again first pass since writing it\r\n - This is what I mean when I say \"read the thread\":\r\n - `git log --reverse -p --oneline -- docs/arch/alice/discussion/`\r\n - c6a0dafeae527c5e102abd3ee69189cdfb5e9450\r\n - First mention of the system context was almost immediately, although it wasn't until 2148e16f11a5b5941f19353924ca92e497f81b2a we realized we'd found it\r\n - 3c26ea48b\r\n - > With A/B field testing of new system contexts (changes, running dev branches against dev branches). We start to see a complete picture of the fully connected dev model. We can proactively pair changes from one system context with another system context, both overlayed over a base system context. This is when you have two devs working on two feature branches and both have active PRs. They can now effectively work together because they have this translation, this transparent overlay of their respective diffs to the upstream system context (data flow or program flow in this example).\r\n - https://github.com/intel/dffml/blob/3c26ea48b9d3b66648ef3d676fd015ce171a8761/docs/arch/alice/discussion/0035/reply_0010.md\r\n - Hmmm, we may have stumbled onto the start of the OpenSSF use case doc\r\n - Hey, `git grep` is our friend, let's look for anything talking about CVEs, VEX, vulns, and see if we can scrape together a skeleton use case doc for https://github.com/ietf-scitt/use-cases/issues/14\r\n - A deal is made: 361555718b5ad589a9430efbd0ed88e7bc0582c3 & 4ef226e2ecd384560d635fa84036003b525ad399\r\n - Software supply chain\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0036/reply_0062.md\r\n - "
}
]
},
{
"body": "# 2022-09-09 Engineering Logs",
"replies": [
{
"body": "## 2022-09-09 @pdxjohnny Engineering Logs\r\n\r\n- https://nightingaledvs.com/how-to-visualize-a-graph-with-a-million-nodes/\r\n- "
},
{
"body": "- Manifest Schema docs reference addition\r\n - https://medium.com/mcdonalds-technical-blog/mcdonalds-event-driven-architecture-the-data-journey-and-how-it-works-4591d108821f\r\n- Graph million nodes\r\n - https://nightingaledvs.com/how-to-visualize-a-graph-with-a-million-nodes/\r\n - https://cosmograph.app/\r\n- How to choose which data visualization to display / generate for slide decks / presentations\r\n - > Data Visualization Types\r\n - https://www.tapclicks.com/resources/blog/data-visualization-types/\r\n\r\n![FE0CBB03-CF41-4C24-B281-97A7419DB540](https://user-images.githubusercontent.com/5950433/189486866-014dd24a-5f7a-4370-9fbd-d476231fd558.jpeg)\r\n\r\n- A win for shouldi deptree\r\n - > Use data-dist-info-metadata (PEP 658) to decouple resolution from downloading\r\n - https://github.com/pypa/pip/pull/11111"
},
{
"body": "- Great talk from Brooklyn on Edge and Web5\r\n - https://youtu.be/a6fvZA0L-ok\r\n- Good overview of k8s\r\n- https://huggingface.co/bigscience/bloom\r\n - GPT-3"
}
]
},
{
"body": "2",
"replies": [
{
"body": "- L34 through 6\r\n- L229"
}
]
},
{
"body": "# 2022-09-12 Engineering Logs",
"replies": [
{
"body": "## 2022-09-12 @pdxjohnny Engineering Logs\r\n\r\n- https://github.com/kubernetes-sigs/image-builder\r\n - https://github.com/imjasonh/kontain.me\r\n - https://github.com/imjasonh/kontain.me/blob/main/pkg/serve/serve.go\r\n - secrets in last layer for k8s orch\r\n- https://twitter.com/pchaigno/status/1439965320056344577?s=20&t=snDh0RTRB1FYmv2AEeIuWQ\r\n- TOOD\r\n - [ ] DataFlow execution within linux loader to do attestation to secret service and set in env before execing `__start`\r\n - configure NFS then mount as volume via preapply. Use this to cache cloned repos and execute pull instead of clone to resolve deltas for iterative scanning over time.\r\n - subflow reuse ictx output operation grab inputs with definitions who are decents of STATIC and CACHED and NFS (eventually NFS and kubernetes stuff should be overlays)\r\n - Threaded execution of sets of contexts"
}
]
},
{
"body": "# 2022-09-13 Engineering Logs\r\n\r\n- GSoC 2022\r\n - https://summerofcode.withgoogle.com/organizations/python-software-foundation/projects/details/4tE547Oz\r\n - https://summerofcode.withgoogle.com/organizations/python-software-foundation/projects/details/gNdNxmFb\r\n- OpenSSF\r\n - SBOM Everywhere\r\n - https://github.com/ossf/sbom-everywhere/issues/12\r\n - https://docs.google.com/document/d/1iCL7NOSxIc7YpVI2NRANIy46pM-02G_WlPexQqqb2R0/edit\r\n - > - Level 1: clients and SDKs \u2014 Operating system and build system-agnostic command line interpreters (CLIs) that can process source and build output artifacts / as well as process operating system and other dependencies. That output a compliant SBOM that includes the necessary data that addresses all use cases. These tools should be able to be run in a manual or automated (e.g., scripted) fashion as part of an end-to-end CI/CD workflow. These tools will include SDKs that developers can use to customize and extend any base tools, for instance to support additional package managers.\r\n > - Level 2: package manager plugins \u2014 a set of plugins or modules that work natively with the major package managers and repositories such as Maven, npm, and PyPI. These tools will typically require a single line configuration change added in order to run with each subsequent build and will output compliant SBOMs. This work will enhance the best existing open source plugins where they exist.\r\n > - Level 3: native package manager integration \u2014 by adding native SBOM generation functionality to major package managers, all developers and all build systems will automatically generate SBOMs by default as part of their normal workflow. SBOM generation will become as common and seamless as tooling creating log entries for software builds in a log file behind the scenes.\r\n > - Level 4: containerization integration \u2014 by adding native SBOM generation functionality to the containerization build process, the system will use SBOM content provided by included packages plus additional artifacts added during container build to output an SBOM that specifies all the components that make up a container.\r\n > - Level 5: application/solution integration/deployment \u2014 When deploying an application consisting of multiple disparate components (containers, machine images, event driven services) the coordination manager should aggregate the constituent SBOMS to reflect all artifacts that are deployed.",
"replies": [
{
"body": "## 2022-09-13 @pdxjohnny Engineering Logs\r\n\r\n```console\r\n$ dffml service dev export alice.cli:ALICE_COLLECTOR_DATAFLOW | tee alice_collector_dataflow.json\r\n$ (date; (echo URL && sed -e 's/^.*/https:\\/\\/github.com\\/dffml\\/&/' org-repo-list | head -n 1) | dffml dataflow run records all -no-echo -record-def URL -dataflow alice_collector_dataflow.json -sources src=csv dst=mongodb -source-src-filename /dev/stdin -source-src-key URL -source-dst-uri \"${DATABASE_CONNECTION_STRING}\" -source-dst-tlsInsecure -source-dst-log_collection_names -source-dst-collection mycollection -orchestrator kubernetes.job -orchestrator-workdir . -log debug -no-strict -orchestrator-max_ctxs 25 -orchestrator-image docker.io/intel-otc/dffml:latest 2>&1; date) | tee ~/alice-shouldi-contribute-mycollection-$(date +%4Y-%m-%d-%H-%M).txt\r\n...\r\nDEBUG:dffml.JobKubernetesOrchestratorContext:context_path.stat().st_size: 60876856\r\nDEBUG:dffml.JobKubernetesOrchestratorContext:dffml_path.stat().st_size: 157628\r\nERROR:dffml.JobKubernetesOrchestratorContext:Traceback for <Task finished name='Task-7' coro=<JobKubernetesOrchestratorContext.run_operations_for_ctx() done, defined at /src/dffml/dffml/df/kubernetes.py:213> exception=RuntimeError('[\\'kubectl\\', \\'--context\\', \\'kind-kind\\', \\'apply\\', \\'-o=json\\', \\'-k\\', \\'.\\']: Error from server: error when creating \".\": the server responded with the status code 413 but did not return more information (post secrets)\\n')> (most recent call last):\r\n File \"/src/dffml/dffml/df/kubernetes.py\", line 780, in run_operations_for_ctx\r\n raise Exception(\r\n File \"/src/dffml/dffml/util/subprocess.py\", line 140, in run_command\r\n pass\r\n File \"/src/dffml/dffml/util/subprocess.py\", line 83, in run_command_events\r\n raise RuntimeError(\r\nRuntimeError: ['kubectl', '--context', 'kind-kind', 'apply', '-o=json', '-k', '.']: Error from server: error when creating \".\": the server responded with the status code 413 but did not return more information (post secrets)\r\nTraceback (most recent call last):\r\n File \"/home/coder/.local/bin/dffml\", line 33, in <module>\r\n sys.exit(load_entry_point('dffml', 'console_scripts', 'dffml')())\r\n File \"/src/dffml/dffml/util/cli/cmd.py\", line 282, in main\r\n result = loop.run_until_complete(cls._main(*argv[1:]))\r\n File \"/.pyenv/versions/3.9.13/lib/python3.9/asyncio/base_events.py\", line 647, in run_until_complete\r\n return future.result()\r\n File \"/src/dffml/dffml/util/cli/cmd.py\", line 248, in _main\r\n return await cls.cli(*args)\r\n File \"/src/dffml/dffml/util/cli/cmd.py\", line 234, in cli\r\n return await cmd.do_run()\r\n File \"/src/dffml/dffml/util/cli/cmd.py\", line 211, in do_run\r\n return [res async for res in self.run()]\r\n File \"/src/dffml/dffml/util/cli/cmd.py\", line 211, in <listcomp>\r\n return [res async for res in self.run()]\r\n File \"/src/dffml/dffml/cli/dataflow.py\", line 283, in run\r\n async for record in self.run_dataflow(\r\n File \"/src/dffml/dffml/cli/dataflow.py\", line 268, in run_dataflow\r\n async for ctx, results in octx.run(\r\n File \"/src/dffml/dffml/df/memory.py\", line 1721, in run\r\n task.result()\r\n File \"/src/dffml/dffml/df/kubernetes.py\", line 355, in run_operations_for_ctx\r\n await run_command(\r\n File \"/src/dffml/dffml/util/subprocess.py\", line 137, in run_command\r\n async for _, _ in run_command_events(\r\n File \"/src/dffml/dffml/util/subprocess.py\", line 83, in run_command_events\r\n raise RuntimeError(\r\nRuntimeError: ['kubectl', 
'--context', 'kind-kind', 'apply', '-o=json', '-k', '.']: Error from server: error when creating \".\": the server responded with the status code 413 but did not return more information (post sec\r\n```\r\n\r\n- TODO\r\n - [ ] Update Job based Kubernetes Orchestrator to add a note that sometimes a `preapply` is needed to set the limits (required to be set by the namespace?)\r\n - https://github.com/intel/dffml/blob/3e157b391ffc36b6073288d0fe7a21a6a82b55a4/dffml/df/kubernetes.py#L1048-L1108"
}
]
},
{
"body": "# 2022-09-14 Engineering Logs\r\n\r\nIn put networj which resolves or syntehises pipeline orchestrator specifc workflow/job to run data flow effectively using workflow/job syntax as trampoline bacj into dataflow, pull orchestrator secrets applicably\r\n\r\n```console\r\n$ echo -e 'if [[ \"x${RUN_ME}\" != \"x\" ]]; then\\n ${RUN_ME}\\nfi' | RUN_ME='echo hi' bash\r\nhi\r\n```",
"replies": [
{
"body": "- Cattle not pets with state\r\n - Reaching equilibrium with Alice assisted communication faster to bring new nodes into correct place, similar to Graph Neural Network group drone flight work.\r\n\r\n![821D10AA-B705-4667-9F99-98C231BD58A9](https://user-images.githubusercontent.com/5950433/190293910-85bd0d08-0461-400f-8258-16ee161e2a2f.jpeg)\r\n\r\n- shim used with synthesis to manifest ingesting job with matrix to trampoline via orchestrator specific call to index job\r\n- People always have [\u201cright of way\u201d](https://en.m.wikipedia.org/wiki/International_Regulations_for_Preventing_Collisions_at_Sea#Part_B_.E2.80.93_Steering_and_sailing) over machines (example: cars)\r\n- Blames on ths file to the graph with aithors so we know whos most recent point of ckbtact like krnel cc for quering to ask for help (survey)\r\n- How to run a on tmux / ssh entry to shell\r\n- References\r\n - https://www.baeldung.com/linux/remove-last-n-lines-of-file\r\n\r\n```console\r\n$ echo -e 'if [[ \"x${RUN_ME}\" != \"x\" ]]; then\\n ${RUN_ME}\\nfi' | RUN_ME='echo hi' >> ~/.bashrc\r\n$ sed -i \"$(( $(wc -l <~/.bashrc)-3+1 )),$ d\" ~/.bashrc\r\n$ diff ~/.bashrc ~/.bashrc.bak\r\n173a174,176\r\n> if [[ \"x${RUN_ME}\" != \"x\" ]]; then\r\n> ${RUN_ME}\r\n> fi\r\n```"
},
{
"body": "# Architecting Alice: A Shared Stream of Consciousness\r\n\r\n> Moved to: https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md\r\n\r\nIn this tutorial we use dataflow as class to build Input, Redundancy,\r\nLock, Operation, and Operation Implementation Networks which interact\r\nwith ActiveMQ and Redis. These will enable us to overlay them on\r\nAlice's process local memory resident implementations to facilitate\r\na shared stream of consciousness.\r\n\r\nWe then show how two different instances of Alice can communicate where\r\nsome operation implementations are instantiated in one process space,\r\nand some in another, we'll then watch them run a simple flow which\r\nprint the message \"Alice Online\" and \"Bob Online\" to each side.\r\n\r\n```mermaid\r\ngraph TD\r\n developer_online --> notify_dev_online\r\n```\r\n\r\n```python\r\ndef developer_online() -> DeveloperOnlineName:\r\n return getpass.getuser()\r\n\r\ndef notify_dev_online(developer: DeveloperOnlineName):\r\n print(f\"{developer.title() Online\")\r\n```\r\n\r\nLater in Architecting Alice, we'll add in rekor to get data\r\nprovenance and put the whole bit behind an HTTP API. We validate data\r\nusing SCITT. We could optionally require passes from filter operations.\r\nCould add in more mixins to rekor to check on addition.\r\n\r\nIn Coach Alice, we'll see these techniques used to support caching of\r\ncomplex inputs such as directory trees (creating new inputs on load\r\nby inspecting cached state overlayed). Our work with the OpenSSF\r\nmeans that we'll want to be scanning lots of VCS (git, svn, etc.) repos.\r\nWe'll use this to cache those repos and restore repos from cached state,\r\nthen run an update for the delta, then save back to cache. This way\r\nwe can avoid running the full download for larger repos. Small repos\r\nwe can examine past runs to estimate size and just clone every time\r\nto avoid resource usage of caching. This will building on our Architecting Alice Webhook Based Distributed Compute leveraging Jenkins (~~if rate limit for github doesnt apply to workflow dispatch then build off that~~ https://docs.github.com/en/actions/learn-github-actions/usage-limits-billing-and-administration#usage-limits) and the Manifest concept.\r\n\r\nIn Coach Alice we'll also see how we can use this distributed stream\r\nof consciousness to assist with developer communication. We can enable\r\ndevelopers to give Alice dataflows which she runs in the background.\r\nShe can then say oh the dev API server restarted (maybe it's your or\r\nyour friends laptop running the API, or a real server). This gives\r\nthe same impact for both users, a little `notify-send` popup.\r\n\r\n- References\r\n - https://activemq.apache.org/python\r\n - For Python support we recommend the [Python Stomp Client](http://stomp.github.com/implementations.html)\r\n - https://stomp.github.io/implementations.html\r\n- Future\r\n - Notify on diff to discussion thread or git repo with upleveling"
}
]
},
{
"body": "# 2022-09-15 Engineering Logs",
"replies": [
{
"body": "## 2022-09-15 Open Architecture\r\n\r\n- OA: SCITT for provenance (SPDX DAG for DAG?) plus overlayed (aka generic admission controller, return 0/1) policy. Use example from yesterday, psudo code release flow with checks to SCITT as if it was a BOM/database being added to as the product is built. Come up with places where policy is relevant: incoming vuln, package, sign, release (dont sign unless X, dont release unless Y, new vuln? Run policy check to determine if it effects your arch, take actions (re-roll with updated dep) acrodingly\r\n- Relized SCITT will probably still not define the graph\r\n - Looking for the SPDX DAG work or antyhing like it: https://www.google.com/search?hl=en&q=spdx%20%22dag%22&tbs=qdr%3Am\r\n- References\r\n - https://github.com/git-bom/gitbom-rs/issues/18\r\n - > There was a discussion in today's GitBOM meeting about the utility of separating generation of gitoids from the generation of a GitBOM DAG. (@)edwarnicke has implemented this split in Go (https://github.com/edwarnicke/gitoid) (WIP) and described it as being a valuable change. The idea is that by splitting this out, other uses of gitoids can be explored.\r\n - https://github.com/edwarnicke/gitoid\r\n- SCITT\r\n - https://github.com/ietf-scitt/charter/blob/master/ietf-scitt-charter.md\r\n - https://github.com/ietf-scitt/use-cases/blob/main/hardware_microelectronics.md\r\n - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n > ```\r\n > Artifact\r\n > |\r\n > v +------------------+\r\n > Issuer -> Statement Envelope | DID Key Manifest |\r\n > \\ / | (decentralized) |\r\n > \\ / +------------------+\r\n > \\ ______/ | |\r\n > | | |\r\n > v signature | |\r\n > Claim <--------------/ |\r\n > | |\r\n > | Claim +---------+ |\r\n > |------------>| Trans. | |\r\n > Transparency -> +<------------| Registry| /\r\n > Service | Receipt +--------+ X\r\n > v / \\\r\n > Transparent / \\\r\n > Claim / |\r\n > |\\ / |\r\n > | \\ / |\r\n > | \\ / |\r\n > Verifier -> | Verify Claim |\r\n > | |\r\n > Auditor -> Collect Receipts Replay Registry\r\n > ```"
},
{
"body": "## 2022-09-15 @pdxjohnny Engineering Logs\r\n\r\n- Intuitive and Accessible Docs github device vscoder esq flow\r\n- Removed esoteric text from vol 0 a shell for a ghost\r\n - Loosing ego to find perspective. Stepping into the shoes of others to help us see things from theor perspective helps us understand how to better communicate eoth them (LMWC). We can then use these same principles (what do we think they would think about situation X) to figure out howbest to communicate with ourselves. Thought communication protocol can be used for both communication with other entities and with the self. This chapter we will have to figure out how to dive into this perspective shift. Just talk a little about how we need to truly drop any preconceived notions of who the self is. Because everyone is just a different construct in everyone elses head. There is no one self. Because we exist within the realities of everyone else as well. Which means when the next decision on the collective reality is made, (that tick and tock when we all take the lock will come into play later, when we max out that collective good decision making) we all instantiate effectively as it lives within the actived and deactived signals within the architecture. We never exist again in the same form. We collectively approach infinity by nature of life itself being the only constant we know. Life exists to create more life, it is transport itself, it is the truth we know inside ourself of ourself if we are able to step outside the self and look back at it. This is the shell for the Ghost. The Ghost is you, the soul. The Trinity is therefore the transport (soul, ghost, strategic principles, upstream), entity (self, body, overlayed conscious / cached states), and the architecture (humans, Open Architecture, brain / mind, not sure if orchestrator fits here; possibly when orchestration is bound by underlying description of architecture, the perpetual search(er) for the cleanest architecture: Alice).\r\n- Jenkins\r\n - https://github.com/jenkinsci/jenkinsfile-runner\r\n - Noticed mention of building on Pull request\r\n - Publish incremental releases for pull requests\r\n - https://github.com/jenkinsci/jenkinsfile-runner/releases/tag/1.0-beta-30\r\n - https://github.com/jenkinsci/jep/tree/master/jep/305\r\n - https://github.com/jenkinsci/jenkinsfile-runner/pull/525\r\n - https://github.com/jenkinsci/custom-war-packager/#configuration-file\r\n - Use this to add plugins\r\n - https://github.com/jenkinsci/jenkinsfile-runner/tree/main/demo/cwp\r\n - https://github.com/jenkinsci/jenkinsfile-runner/tree/main/demo/pipeline-as-yaml\r\n - https://github.com/jenkinsci/jenkinsfile-runner/pull/651\r\n - https://plugins.jenkins.io/pipeline-as-yaml/\r\n - https://github.com/jenkinsci/custom-war-packager/tree/master/demo/jenkinsfile-runner\r\n - Running this since we have k8s\r\n- Secure software factory\r\n - Goal: Roll container images and publish events to stream of consciousness\r\n - References\r\n - https://github.com/cncf/tag-security/blob/main/supply-chain-security/secure-software-factory/secure-software-factory.md\r\n - https://buildsec.github.io/frsca/\r\n - https://swagitda.com/blog/posts/security-decision-trees-with-graphviz/\r\n - https://www.cncf.io/blog/2022/09/14/protect-the-pipe-secure-ci-cd-pipelines-with-a-policy-based-approach-using-tekton-and-kyverno/\r\n - https://cloudnativesecurityconna22.sched.com/event/1AOkI"
}
]
},
{
"body": "# 2022-09-16\r\n\r\n- John under weather",
"replies": []
},
{
"body": "2",
"replies": [
{
"body": "- Dont forget about the webhooks on all the repos for the central webhook server / stream of consciousness!\r\n- Proxy PyPi extra index to github as a workaround for dependency links?\r\n- https://docs.google.com/document/d/1Ku6y50fY-ZktcUegeCnXLsksEWbaJZddZUxa9z1ehgY/edit\r\n- Still feeling shitty"
}
]
},
{
"body": "",
"replies": [
{
"body": "\r\n- John still feeling shitty"
}
]
},
{
"body": "# 2022-09-19 Engineering Logs\r\n\r\n- TODO\r\n - [ ] Auto increasing symver via hash of `__code__` of ops",
"replies": [
{
"body": "## 2022-09-19 @pdxjohnny Engineering Logs\r\n\r\n- gather and share knowledge\r\n- Configloaders as DataFlow as class add filename to inputs and then also allow for passing \r\n - Idea: DataFlow as Class as function invocation. This would allow you to invoke a python file with only functions. Like kwargs call wraps return of async for run\r\n - import funcname from dffml.call.asyncfunc.dataflow.path\r\n - Oh, were just manually working through the auto refactoring process by starting with the end state\r\n- policy based acceptable risk benefit of the doubt\r\n - be nice, knock and the door shall be opened, karma, pay it forward\r\n - except when risk analysis yields unacceptable results to umbrella/gatekeeper \r\n- Rememeber, we always think in parallel N dimensional interconnected graphs over time\r\n - Align reward to timeline (drop dead dates) to \r\n - Landing many planes at many airports at the same time, how do you reward work so that they all land perfectly timed?\r\n - Look to cooking for insipration on how to make several supply chains some with simialr (interconnections between nodes in graph) data (ingredeiants). Run trials, stream for data retention. Add in ingrediant expiration to account for timeline slip / expiration.\r\n - Is there a way we could incorperate oppertunity cost with this metaphor?\r\n - Cost of food expired - schedule slip\r\n - \r\n - Analyze post stream to build mermaid graphs to or some kind of visualization\r\n- Transparency brings us closer to speed of thought execution\r\n- Project management\r\n - Doc Deck on rewarding alignment for DFFML community to organize\r\n - Source material from thread:\r\n - `grep -i align`\r\n - `grep -i reward`\r\n- first manual taging / labeling / classification for issues, then models"
},
{
"body": "## 2022-09-19 Alice Architecture\r\n\r\n- TODO\r\n - [ ] Write a function that takes a `DataFlow` and produces another `DataFlow`\r\n that is not executable, but is conceptual, an upleveling of the underlying\r\n flow.\r\n - [ ] Write tutorial on how we do this\r\n - [ ] Start with static mapping\r\n - [ ] Operation which inserts operations within dataflow into input network (via return)\r\n - [ ] Optional chains of thought (links between data) can be formed by downstream operations\r\n which take the output of `running_context_dataflow_operations`. The output is of type\r\n `Operation`, `expand` is used on the `@op`.\r\n\r\n```mermaid\r\ngraph TD\r\n cli\r\n please_contribute_recommended_community_standards\r\n\r\n cli --> please_contribute_recommended_community_standards\r\n \r\n```"
}
]
},
{
"body": "# 2022-09-20 Engineering Log\r\n\r\n- https://github.com/TheAliceProject\r\n - > The Alice Project at Carnegie Mellon University's Entertainment Technology Center is dedicated to creating tools to teach computer science through creativity. http://alice.org/\r\n- https://fluxcd.io/blog/2022/08/manage-kyverno-policies-as-ocirepositories/\r\n - Admission control k8s policy controller with kyverno storing policies as artifacts in oci reg\r\n - Could we have sbom stored as povenace for policy?\r\n - Sbom for policy includes data sets and docs and org contacts\r\n- The cells are working together\r\n - ad-hoc over time (within lifetime tick and tock, mutation/fork/downstream/patched/evolution) distributed by function\r\n - Communication through both peer to peer and central stream of consiousness\r\n- analogy using LTMs and OpenSSF scorecard and LEED certification\r\n - https://support.usgbc.org/hc/en-us/articles/4404406912403-What-is-LEED-certification-#LEED\r\n - Analogy point is focus on time (beyond the onion security model, defense in depth pver tome requires maintainance)\r\n- time for kcp stream!\r\n - https://twitter.com/lorenc_dan/status/1572181327788777476?s=20&t=dvaRWcxul3i94V8vqYMG9A\r\n - Kcp spec as manifest reverse proxy to jenkins\r\n - KCP on top of OpenFaaS managed by ArgoCD\r\n - Alice creates PRs to state config\r\n - SBOMS: https://github.com/opensbom-generator/spdx-sbom-generator/blob/main/examples/modules.json\r\n - DERP (see https://goto.intel.com/devenvdocs deployment engineering logs)\r\nWe can use this as the stream proxy (everything speaks HTTP)\r\n\r\n![TrinityCalls](https://user-images.githubusercontent.com/5950433/191273573-c5a805d5-48e9-49cc-aa84-680ded4b401f.gif)\r\n\r\n- Lock established\r\n - Model mixes via Overlays and DataFlow as class\r\n - stable diffusion examples\r\n- Rewarding alignment doc deck\r\n - https://www.sphinx-doc.org/en/master/usage/builders/index.html#sphinx.builders.latex.LaTeXBuilder\r\n- Use case doc\r\n- Need faster way to edit github discussion as markdown\r\n - Could we do `python -m rich.markdown FILENAME` on one side and a reupload on the other?\r\n - Problem: drag and drop pictures\r\n - https://rich.readthedocs.io/en/stable/markdown.html\r\n- https://github.com/guacsec/guac\r\n - Similar to SCITT\r\n - Will collaberate with them\r\n - OA is essentially adding policy to assit with managing lifecycle (patching vulns and retesting downstreams and rereleasing defined in Part / checjed via policy)\r\n- TODO\r\n - [ ] Type up context aware policy notes",
"replies": [
{
"body": "- https://w3c-ccg.github.io/meetings/2022-09-20-traceability/\r\n - Orie in here it looks like"
}
]
},
{
"body": "# 2022-09-21 Engineering Logs\r\n\r\n- We are on DevMesh!\r\n - https://devmesh.intel.com/projects/alice\r\n- https://www.linkedin.com/posts/activity-6978347010844225536-2PFL/\r\n- https://chaoss.community/metrics/\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/191525098-951bc7fb-dd47-47b2-a8c3-1199500f570d.png)\r\n",
"replies": [
{
"body": "## 2022-09-21 @pdxjohnny Engineering Log\r\n\r\n- Created profile on DevMesh\r\n - https://devmesh.intel.com/users/john-andersen-641a39/\r\n- Vol 3 (On Mind Control): Exploiting Bureaucracy: Wording Is Everything\r\n - https://devmesh.intel.com/projects/congress-bill-creator-oneapi-nlp-project#about-section\r\n- Funding model work: For feature requests measure references from other issues to measure downstream impact\r\n- The chaos god provides. It ends not with a bang, but with a\r\n - https://github.com/openai/whisper\r\n - Chaos, down the rabbit hole\r\n - Once again we\u2019ve arrived at the same conclusion.\r\n - atoms flip grep"
}
]
},
{
"body": "# 2022-09-22 Engineering Logs\r\n\r\n- Gnosticism & The Supreme Reality - Alan Watts\r\n - https://anchor.fm/sabrina-borja/episodes/Gnosticism--The-Supreme-Reality---Alan-Watts-eehqgr\r\n - https://anchor.fm/s/1351bf54/podcast/rss\r\n - https://d3ctxlq1ktw2nl.cloudfront.net/staging/2020-05-25/24a16eaddc18ff58c96e24bee0faf6b8.m4a\r\n - Time for whisper\r\n\r\n```console\r\n$ curl -sfL https://anchor.fm/s/1351bf54/podcast/rss | tee podcasts.rss.xml\r\n$ grep -C 4 '\\.m' podcasts.rss.xml | grep -A 5 Gnos\r\n <link>https://anchor.fm/sabrina-borja/episodes/Gnosticism--The-Supreme-Reality---Alan-Watts-eehqgr</link>\r\n <guid isPermaLink=\"false\">6f19c9d0-5d94-4858-8387-1cec43c39569</guid>\r\n <dc:creator><![CDATA[Sabrina Borja]]></dc:creator>\r\n <pubDate>Mon, 25 May 2020 14:42:18 GMT</pubDate>\r\n <enclosure url=\"https://anchor.fm/s/1351bf54/podcast/play/14264283/https%3A%2F%2Fd3ctxlq1ktw2nl.cloudfront.net%2Fstaging%2F2020-05-25%2F24a16eaddc18ff58c96e24bee0faf6b8.m4a\" length=\"50094380\" type=\"audio/x-m4a\"/>\r\n <itunes:summary>&lt;p&gt;Alan Watts talks about the gnosticism and the supreme reality&lt;/p&gt;\r\n```\r\n\r\n- compute\r\n - to go from the state of unknown to the state of known\r\n - pursuit of knowledge",
"replies": [
{
"body": "## 2022-09-22 @pdxjohnny Engineering Logs\r\n\r\n- ashes to ashes dust to dust, from beyond chaos we came and to beyond chaos shall we return. \u231b\ufe0f\r\n - Falling through to the other side of the hourglass.\r\n - Remember we've gone down the rabbit hole.\r\n - We'll go out through the looking glass.\r\n\r\n![alice-through-rabbit-hole-eye-of-hourglass](https://user-images.githubusercontent.com/5950433/191897229-0cd824ad-5368-45ce-8f60-c9aa814cdfd0.gif)\r\n\r\n- k8s (job orchestrator, cloud dev envs, etc.)\r\n - https://kubernetes.io/docs/reference/node/kubelet-checkpoint-api/\r\n - Requires `Kubernetes v1.25 [alpha]`\r\n- [Architecting Alice: Writing the Wave](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0004_writing_the_wave.md)\r\n - https://github.com/intel/dffml/commit/baa1e2b986afb48325be379c60612c9c4aac7651\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0023/reply_0055.md\r\n- [Troubleshooting Failed `pip install` Commands](https://github.com/intel/dffml/discussions/1406#discussioncomment-3710985)\r\n- Resources\r\n - Badges\r\n - https://shields.io/\r\n- Misc.\r\n - Gustav: https://www.lyrics.com/lyric/10511458/Alice%27s+Restaurant"
},
{
"body": "# Architecting Alice: Writing the Wave\r\n\r\n> Moved to: https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0004_writing_the_wave.md\r\n\r\n> This tutorial was written by echoing examples to the shell, then code blocking the relevant console commands. We're going to use what we build here to allow tutorial writers to either speak the echo commands and we'll later insert them into the asciinema recordings we scrape the commands and outputs from. We could also use the date on the filename we record too plus the offsets to calculate point in time for a given recording. asciicast recordings have new content with a time delta stamp from the last read/write, we probably need to ensure recording are not made with `--idle-time-limit` for this. If we can get streaming working for the lines of asciinema output, critical piece here is ensuring writes are flushed on each line asciinema side, pretty sure this is the case but we need to check. Then we could potentially run these updates markdown comments realtime, Alice doing it sitting alongside of course.\r\n\r\nWe want Alice to be as easy to communicate with as possible so\r\nthat she can be the most helpful possible.\r\n\r\nWe'll be using text to a speech to text model from OpenAI known\r\nas Whisper provide Alice with additional context / input data.\r\nIn future tutorials we'll leverage what we teach Alice here\r\n\r\n## The Time is Come for Thee to Reap\r\n\r\nA good friend to us all, John Van Sickle, whose ffmpeg static\r\nbuilds have saved many of us from an ungodly amount of time\r\nspent in dependency hell.\r\n\r\nWe'll be calling on John today, or well, his HTTP server, to\r\nprovide us with what we all want, ffmpeg that \"just works\".\r\nWhisper requires that we have ffmpeg installed and asking John\r\nfor a binary is usually the easiest way to make that happen.\r\n\r\n```console\r\n$ curl -sfLOC - https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-amd64-static.tar.xz\r\n$ tar xvf ffmpeg-release-amd64-static.tar.xz\r\n```\r\n\r\nMove the downloaded files into a user local binary directory,\r\nwe're sure to have permissions to write here.\r\n\r\n```console\r\n$ mkdir -p ~/.local/bin/\r\n$ mv ffmpeg-5.1.1-amd64-static/{ffmpeg,ffprobe,qt-faststart} ~/.local/bin/\r\n```\r\n\r\nAdd the directory to your `PATH` to ensure you can run the binaries\r\nwe put in there.\r\n\r\n```console\r\n$ export PATH=\"${PATH}:${HOME}/.local/bin\"\r\n```\r\n\r\nAdd the PATH modification to the shell's startup scripts to ensure\r\n*new* shells also know where to get those binaries so as to run them.\r\n\r\n```console\r\n$ echo -e 'export PATH=\"${PATH}:${HOME}/.local/bin\"' | tee -a ~/.bashrc ~/.bash_profile\r\n```\r\n\r\nTry running `ffmpeg`, you should see output similar to the following.\r\n\r\n```console\r\n$ ffmpeg\r\nffmpeg version 5.1.1-static https://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2022 the FFmpeg developers\r\n built with gcc 8 (Debian 8.3.0-6)\r\n configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gmp --enable-libgme --enable-gray --enable-libaom --enable-libfribidi --enable-libass --enable-libvmaf --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librubberband --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libvorbis --enable-libopus 
--enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libdav1d --enable-libxvid --enable-libzvbi --enable-libzimg\r\n libavutil 57. 28.100 / 57. 28.100\r\n libavcodec 59. 37.100 / 59. 37.100\r\n libavformat 59. 27.100 / 59. 27.100\r\n libavdevice 59. 7.100 / 59. 7.100\r\n libavfilter 8. 44.100 / 8. 44.100\r\n libswscale 6. 7.100 / 6. 7.100\r\n libswresample 4. 7.100 / 4. 7.100\r\n libpostproc 56. 6.100 / 56. 6.100\r\nHyper fast Audio and Video encoder\r\nusage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...\r\n\r\nUse -h to get full help or, even better, run 'man ffmpeg'\r\n```\r\n\r\nThanks again John!\r\n\r\n## Not With a Bang, but With a Whisper\r\n\r\nOpenAI does some cool stuff! They released a model we'll be wrapping\r\nas an operation, first we'll do some basic setup and usage of their\r\ntext to speech code / model called Whisper.\r\n\r\n- References\r\n - https://github.com/openai/whisper\r\n - https://github.com/openai/whisper/blob/e90b8fa7e845ae184ed9aa0babcf3cde6f16719e/README.md\r\n- Troubleshooting\r\n - If pytorch/troch fails to download try downloading and installing separately it to see if that helps.\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3710985\r\n\r\nCheck their page for the most up to date information on how to install it.\r\n\r\n```console\r\n$ pip install git+https://github.com/openai/whisper.git\r\nDefaulting to user installation because normal site-packages is not writeable\r\nCollecting git+https://github.com/openai/whisper.git\r\n Cloning https://github.com/openai/whisper.git to /tmp/pip-req-build-1x3f7bij\r\n Running command git clone --filter=blob:none --quiet https://github.com/openai/whisper.git /tmp/pip-req-build-1x3f7bij\r\no Resolved https://github.com/openai/whisper.git to commit e90b8fa7e845ae184ed9aa0babcf3cde6f16719e\r\n Preparing metadata (setup.py) ... 
done\r\nCollecting numpy\r\n Using cached numpy-1.23.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (17.1 MB)\r\nRequirement already satisfied: torch in ./.local/lib/python3.9/site-packages (from whisper==1.0) (1.12.1)\r\nCollecting tqdm\r\n Downloading tqdm-4.64.1-py2.py3-none-any.whl (78 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 78.5/78.5 kB 11.1 MB/s eta 0:00:00\r\nCollecting more_itertools\r\n Downloading more_itertools-8.14.0-py3-none-any.whl (52 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 52.2/52.2 kB 18.7 MB/s eta 0:00:00\r\nCollecting transformers>=4.19.0\r\n Downloading transformers-4.22.1-py3-none-any.whl (4.9 MB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 4.9/4.9 MB 22.8 MB/s eta 0:00:00\r\nCollecting ffmpeg-python==0.2.0\r\n Downloading ffmpeg_python-0.2.0-py3-none-any.whl (25 kB)\r\nCollecting future\r\n Downloading future-0.18.2.tar.gz (829 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 829.2/829.2 kB 51.4 MB/s eta 0:00:00\r\n Preparing metadata (setup.py) ... 
done\r\nRequirement already satisfied: packaging>=20.0 in ./.local/lib/python3.9/site-packages (from transformers>=4.19.0->whisper==1.0) (21.3)\r\nRequirement already satisfied: pyyaml>=5.1 in ./.local/lib/python3.9/site-packages (from transformers>=4.19.0->whisper==1.0) (6.0)\r\nCollecting tokenizers!=0.11.3,<0.13,>=0.11.1\r\n Downloading tokenizers-0.12.1-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (6.6 MB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 6.6/6.6 MB 23.8 MB/s eta 0:00:00\r\nRequirement already satisfied: regex!=2019.12.17 in ./.local/lib/python3.9/site-packages (from transformers>=4.19.0->whisper==1.0) (2022.7.25)\r\nCollecting filelock\r\n Downloading filelock-3.8.0-py3-none-any.whl (10 kB)\r\nRequirement already satisfied: requests in ./.local/lib/python3.9/site-packages (from transformers>=4.19.0->whisper==1.0) (2.28.1)\r\nCollecting huggingface-hub<1.0,>=0.9.0\r\n Downloading huggingface_hub-0.9.1-py3-none-any.whl (120 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 120.7/120.7 kB 15.8 MB/s eta 0:00:00\r\nRequirement already satisfied: typing-extensions in ./.local/lib/python3.9/site-packages (from torch->whisper==1.0) (4.3.0)\r\nRequirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in ./.local/lib/python3.9/site-packages (from packaging>=20.0->transformers>=4.19.0->whisper==1.0) (3.0.9)\r\nRequirement already satisfied: charset-normalizer<3,>=2 in ./.local/lib/python3.9/site-packages (from requests->transformers>=4.19.0->whisper==1.0) (2.1.0)\r\nRequirement already satisfied: idna<4,>=2.5 in ./.local/lib/python3.9/site-packages (from requests->transformers>=4.19.0->whisper==1.0) (3.3)\r\nRequirement already satisfied: certifi>=2017.4.17 in ./.local/lib/python3.9/site-packages (from requests->transformers>=4.19.0->whisper==1.0) (2022.6.15)\r\nRequirement already satisfied: urllib3<1.27,>=1.21.1 in ./.local/lib/python3.9/site-packages (from requests->transformers>=4.19.0->whisper==1.0) (1.26.11)\r\nBuilding wheels for collected packages: whisper, future\r\n Building wheel for whisper (setup.py) ... done\r\n Created wheel for whisper: filename=whisper-1.0-py3-none-any.whl size=1173962 sha256=2972ec82594a159a312f32a82c755a0aa9d896d2fbcfe4e517d2df89d0ac9dc4\r\n Stored in directory: /tmp/pip-ephem-wheel-cache-42cy9_3c/wheels/fe/03/29/e7919208d11b4ab32972cb448bb84a9a675d92cd52c9a48341\r\n Building wheel for future (setup.py) ... 
done\r\n Created wheel for future: filename=future-0.18.2-py3-none-any.whl size=491058 sha256=8cd76024b97611296081328e7fbcfe960b3b533abba60af5bf5e1ecdd959070d\r\n Stored in directory: /home/coder/.cache/pip/wheels/2f/a0/d3/4030d9f80e6b3be787f19fc911b8e7aa462986a40ab1e4bb94\r\nSuccessfully built whisper future\r\nInstalling collected packages: tokenizers, tqdm, numpy, more_itertools, future, filelock, huggingface-hub, ffmpeg-python, transformers, whisper\r\nSuccessfully installed ffmpeg-python-0.2.0 filelock-3.8.0 future-0.18.2 huggingface-hub-0.9.1 more_itertools-8.14.0 numpy-1.23.3 tokenizers-0.12.1 tqdm-4.64.1 transformers-4.22.1 whisper-1.0\r\n```\r\n\r\nThe model downloads on first load, so we need a one off python\r\ncommand to trigger the download. This block of code will be\r\nused on operation implementation context entry.\r\n\r\n- References\r\n - https://intel.github.io/dffml/main/examples/shouldi.html#pypi-operations\r\n\r\n```console\r\n$ python -uc 'import whisper; whisper.load_model(\"base\")'\r\nThe cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.\r\nMoving 0 files to the new cache system\r\n0it [00:00, ?it/s]\r\n100%|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588| 139M/139M [00:02<00:00, 61.9MiB/s]\r\n```\r\n\r\nGreat! The model downloaded using our one off command.\r\n\r\nLet's try running an audio file through for transcription.\r\n\r\nWhile falling down the rabbit hole we came across an interesting\r\nrecording from our good friend, Alan Watts. 
We'd love to save\r\nknowledge contained in it for easy reference and use later.\r\n\r\n- Gnosticism & The Supreme Reality - Alan Watts\r\n - https://anchor.fm/sabrina-borja/episodes/Gnosticism--The-Supreme-Reality---Alan-Watts-eehqgr\r\n\r\n### RSS feed us the Audio file please and thank you\r\n\r\n[![hack-the-planet](https://img.shields.io/badge/hack%20the-planet-blue)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548)\r\n\r\nFrom the webpage we found an RSS URL for the podcast.\r\n\r\n- We download the RSS feed\r\n - `curl -sfL https://example.com/rss`\r\n- Filter for `.mp4` or `.mp3` references\r\n - `grep -C 4 '\\.m'`\r\n- Filter once more for a word from the title we are looking for\r\n - `grep -C 5 -i Gnosticism`\r\n\r\n```console\r\n$ curl -sfL https://anchor.fm/s/1351bf54/podcast/rss | grep -C 4 '\\.m' | grep -C 5 -i Gnosticism\r\n <itunes:summary>&lt;p&gt;Alan Watts questions if we are still thinking&lt;/p&gt;\r\n\r\n---\r\n\r\n--\r\n <link>https://anchor.fm/sabrina-borja/episodes/Gnosticism--The-Supreme-Reality---Alan-Watts-eehqgr</link>\r\n <guid isPermaLink=\"false\">6f19c9d0-5d94-4858-8387-1cec43c39569</guid>\r\n <dc:creator><![CDATA[Sabrina Borja]]></dc:creator>\r\n <pubDate>Mon, 25 May 2020 14:42:18 GMT</pubDate>\r\n <enclosure url=\"https://anchor.fm/s/1351bf54/podcast/play/14264283/https%3A%2F%2Fd3ctxlq1ktw2nl.cloudfront.net%2Fstaging%2F2020-05-25%2F24a16eaddc18ff58c96e24bee0faf6b8.m4a\" length=\"50094380\" type=\"audio/x-m4a\"/>\r\n <itunes:summary>&lt;p&gt;Alan Watts talks about the gnosticism and the supreme reality&lt;/p&gt;\r\n\r\n---\r\n\r\n--\r\n <link>https://anchor.fm/sabrina-borja/episodes/What-Do-You-Desire----Alan-Watts-eehn6o</link>\r\n```\r\n\r\nLet's download the recording using the URL to the `.m4a` we found.\r\n\r\n```console\r\n$ curl -sfLC - -o alan-watts-gnosticism.m4a https://anchor.fm/s/1351bf54/podcast/play/14264283/https%3A%2F%2Fd3ctxlq1ktw2nl.cloudfront.net%2Fstaging%2F2020-05-25%2F24a16eaddc18ff58c96e24bee0faf6b8.m4a\r\n```\r\n\r\nWe'll double check the file type.\r\n\r\n```console\r\n$ file alan-watts-gnosticism.m4a\r\nalan-watts-gnosticism.m4a: ISO Media, MP4 Base Media v1 [IS0 14496-12:2003]\r\n```\r\n\r\n[![write-the-docs](https://img.shields.io/badge/write%20the-docs-success)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548)\r\n\r\nCalculate the SHA. When we wrote the docs for this, we ran the following\r\ncommand to calculate a cryptographic hash of the contents of the file.\r\nIn the next command, we use the hash captured at time of writing the tutorial\r\nand ask the `sha384sum` command to verify that the contents of the file\r\nmatch the expected hash.\r\n\r\nIf you're writing more tutorials for Alice, you'll want to calculate the hash\r\nof any files you use so that others can verify that they downloaded the same file\r\nyou did! We don't want anyone to get confused at why something doesn't work,\r\nsimply because the file they downloaded didn't have the expected contents!\r\n\r\n```console\r\n$ sha384sum alan-watts-gnosticism.m4a\r\ndb9504a15b19bac100093fffe69ce2ab6dd7ed017978c7afcf6ff70db0f288c56b470224e4bcc8b23b927029de13d60a alan-watts-gnosticism.m4a\r\n```\r\n\r\n[![mindset-security](https://img.shields.io/badge/mindset-security-critical)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548)\r\n\r\nVerify the contents are as expected; you can check the output of the\r\nprevious command to make sure the hash you see matches these docs. 
You\r\ncan also run the next command, which will fail if the contents do not\r\nmatch the hash provided here via `<<<`.\r\n\r\n```console\r\n$ sha384sum -c - <<< 'db9504a15b19bac100093fffe69ce2ab6dd7ed017978c7afcf6ff70db0f288c56b470224e4bcc8b23b927029de13d60a alan-watts-gnosticism.m4a'\r\nalan-watts-gnosticism.m4a: OK\r\n```\r\n\r\nNow that we have our audio file, let's try transcription.\r\nFirst we reduce the length of the recording to be transcribed\r\nso that this goes faster.\r\n\r\n```console\r\n$ ffmpeg -t 60 -i alan-watts-gnosticism.m4a -acodec copy alan-watts-gnosticism-first-60-seconds.m4a\r\n```\r\n\r\nNow we'll ask whisper to transcribe those first 60 seconds for us.\r\nThis took about an hour on first run.\r\n\r\n- Troubleshooting\r\n - Troubleshooting Failed Whisper Transcriptions\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3711966\r\n\r\n```console\r\n$ python -uc 'import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"])' alan-watts-gnosticism-first-60-seconds.m4a\r\n/home/coder/.local/lib/python3.9/site-packages/whisper/transcribe.py:70: UserWarning: FP16 is not supported on CPU; using FP32 instead\r\n warnings.warn(\"FP16 is not supported on CPU; using FP32 instead\")\r\nDetected language: english\r\n\r\n\r\n Of course, what we've been talking about is not so much a set of ideas as an experience, or shall we say, experiencing. And this kind of seminar in comparison with encounter groups or workshops of various kinds or experiments in sensory awareness is now being called a conceptual seminar. Although I'm not talking about concepts, but the crucial question arises that an understanding, a real feeling understanding of the polar relationship between the\r\n```\r\n\r\nLet's try with the tiny English-only model and see if that speeds\r\nthings up.\r\n\r\n```console\r\n$ python -uc 'import whisper; whisper.load_model(\"tiny.en\")'\r\nThe cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.\r\nMoving 0 files to the new cache system\r\n0it [00:00, ?it/s]\r\n100%|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588| 139M/139M [00:02<00:00, 61.9MiB/s]\r\n```\r\n\r\nWe'll add the `language=\"en\"` decode option to our call to\r\n`model.transcribe()`.\r\n\r\n- References\r\n - https://github.com/openai/whisper/blob/e90b8fa7e845ae184ed9aa0babcf3cde6f16719e/whisper/__main__.py#L1-L4\r\n - https://github.com/openai/whisper/blob/e90b8fa7e845ae184ed9aa0babcf3cde6f16719e/whisper/transcribe.py#L78\r\n\r\n```console\r\n$ time python -uc 'import sys, whisper; print(whisper.load_model(\"tiny.en\").transcribe(sys.argv[-1], language=\"en\")[\"text\"])' alan-watts-gnosticism-first-60-seconds.m4a\r\n/home/coder/.local/lib/python3.9/site-packages/whisper/transcribe.py:70: UserWarning: FP16 is not supported on CPU; using FP32 instead\r\n warnings.warn(\"FP16 is not supported on CPU; using FP32 instead\")\r\n Of course, what we've been talking about is not so much a set of ideas as an experience, or shall we say experiencing. 
And this kind of seminar in comparison with encounter groups or workshops of various kinds or experiments in sensory awareness is now being called a conceptual seminar. Although I'm not talking about concepts, but the crucial question arises that an understanding, a real feeling understanding of the polar relationship between the\u2026\r\n\r\nreal 15m33.964s\r\nuser 4m41.394s\r\nsys 0m14.513s\r\n```\r\n\r\n## Into the Ether\r\n\r\nJust like us, Alice thinks in parallel. We can't very well\r\nhave all Alice's time being spent transcribing audio files.\r\nWe need her help with too many things for that. We are about to\r\nteach her how to transcribe for us in the background, using\r\na different CPU thread.\r\n\r\nAt the time of writing this tutorial Alice's orchestration is\r\nable to run concurrent operations but does not transparently\r\nrun non-concurrent (no `async`, just a `def`) operations within\r\nthreads so as to make them concurrent.\r\n\r\n- References\r\n - https://docs.python.org/3/library/threading.html\r\n\r\n> Eventually the orchestrator will be updated so that it takes op kwargs and decides if it should run it in a thread or not. **TODO** We need an issue to track this.\r\n> - References\r\n> - https://github.com/intel/dffml/issues/245\r\n\r\n[![use-the-source](https://img.shields.io/badge/use%20the-source-blueviolet)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548)\r\n\r\nThere is an example within the DFFML source code which we can pull\r\nfrom, if only we could find it first...\r\n\r\nLet's head over to a copy of DFFML and look for what we want, any\r\nmention of \"thread\".\r\n\r\n```console\r\n$ cd /src/dffml\r\n$ git grep -i thread\r\n```\r\n\r\nIn the output we see:\r\n\r\n```console\r\nfeature/auth/dffml_feature_auth/feature/operations.py: illustrate threading. 100000 is probably not enough iterations!!!\r\nfeature/auth/dffml_feature_auth/feature/operations.py: # we submit to the thread pool. Weird behavior can happen if we raise in\r\nfeature/auth/dffml_feature_auth/feature/operations.py: self.pool = concurrent.futures.ThreadPoolExecutor()\r\n```\r\n\r\nAs mentioned by the [Python documentation on threading](https://docs.python.org/3/library/threading.html),\r\nwe see the use of [`concurrent.futures.ThreadPoolExecutor`](https://docs.python.org/3/library/concurrent.futures.html#concurrent.futures.ThreadPoolExecutor).\r\n\r\nOur example code is as follows, we'll copy directly from it but replace\r\nthe call to `self.hash_password`, a non-concurrent function, with our\r\ntranscription function.\r\n\r\nhttps://github.com/intel/dffml/blob/9f06bae59e954e5fe0845d416500d8418b5907bf/feature/auth/dffml_feature_auth/feature/operations.py#L101-L134\r\n\r\n- TODO\r\n - [ ] Stream input\r\n - [ ] Stream output\r\n - [ ] Fix\r\n - [ ] Configurable yield break points (via overlay based replacement of op? or config at a minimum similar to `\\n` on `StreamReader.readline()`)"
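A minimal sketch of that threaded transcription, mirroring the `ThreadPoolExecutor` pattern from the auth operations above; the function name and defaults are illustrative, not the tutorial's final operation code:

```python
import asyncio
import concurrent.futures

import whisper


async def transcribe_in_thread(path: str, model_name: str = "tiny.en") -> str:
    loop = asyncio.get_running_loop()
    with concurrent.futures.ThreadPoolExecutor() as pool:
        # Both model load and transcription block the thread they run in,
        # so hand each off to the pool instead of blocking the event loop
        model = await loop.run_in_executor(pool, whisper.load_model, model_name)
        result = await loop.run_in_executor(
            pool, lambda: model.transcribe(path, language="en")
        )
    return result["text"]


print(asyncio.run(transcribe_in_thread("alan-watts-gnosticism-first-60-seconds.m4a")))
```

Loading the model inside the executor also keeps the event loop free during the slow first-run model download.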
},
{
"body": "## Troubleshooting Failed `pip install` Commands\r\n\r\n### Context\r\n\r\nSometimes downloading a package with pip will fail.\r\n\r\n```console\r\n$ ulimit -c unlimited\r\n$ python -m pip download torch\r\nCollecting torch\r\n Downloading torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl (776.4 MB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2578 776.3/776.4 MB 13.0 MB/s eta 0:00:01Killed\r\n```\r\n\r\n### Possible Solution: Manual Install of Problematic Python Dependency\r\n\r\n- This troubleshooting solution covers\r\n - Increase memory limit for processes (userspace)\r\n - Find the download URL of a python package\r\n - Download a python package with download resumption\r\n - Verify the contents of the package downloaded using a SHA\r\n - Install package from downloaded wheel\r\n\r\nLook for the path to the download you want.\r\n\r\n```console\r\n$ curl -sfL https://pypi.org/simple/torch/ | grep torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl\r\n <a href=\"https://files.pythonhosted.org/packages/1e/2f/06d30fbc76707f14641fe737f0715f601243e039d676be487d0340559c86/torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl#sha256=9b356aea223772cd754edb4d9ecf2a025909b8615a7668ac7d5130f86e7ec421\" data-requires-python=\"&gt;=3.7.0\" >torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl</a><br />\r\n```\r\n\r\nDownload the package.\r\n\r\n```console\r\n$ curl -fLOC - https://files.pythonhosted.org/packages/1e/2f/06d30fbc76707f14641fe737f0715f601243e039d676be487d0340559c86/torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl\r\n % Total % Received % Xferd Average Speed Time Time Time Current\r\n Dload Upload Total Spent Left Speed\r\n100 740M 100 740M 0 0 85.1M 0 0:00:08 0:00:08 --:--:-- 106M\r\n```\r\n\r\nVerify the SHA appended to our downloaded URL from our initial command.\r\n\r\n```console\r\n$ sha256sum -c - <<<'9b356aea223772cd754edb4d9ecf2a025909b8615a7668ac7d5130f86e7ec421 torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl'\r\ntorch-1.12.1-cp39-cp39-manylinux1_x86_64.whl: OK\r\n```\r\n\r\nUpdate the package manager\r\n\r\n```console\r\n$ python -m pip install -U pip setuptools wheel\r\nDefaulting to user installation because normal site-packages is not writeable\r\nRequirement already satisfied: pip in /.pyenv/versions/3.9.13/lib/python3.9/site-packages (22.2.1)\r\nCollecting pip\r\n Downloading pip-22.2.2-py3-none-any.whl (2.0 MB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 2.0/2.0 MB 10.3 MB/s eta 0:00:00\r\nRequirement already satisfied: setuptools in /.pyenv/versions/3.9.13/lib/python3.9/site-packages (63.2.0)\r\nCollecting setuptools\r\n Downloading setuptools-65.3.0-py3-none-any.whl (1.2 MB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 1.2/1.2 MB 16.5 MB/s eta 0:00:00\r\nRequirement already satisfied: wheel in /.pyenv/versions/3.9.13/lib/python3.9/site-packages (0.37.1)\r\nInstalling collected packages: setuptools, pip\r\nSuccessfully installed pip-22.2.2 setuptools-65.3.0\r\n\r\n[notice] A new release of pip available: 22.2.1 
-> 22.2.2\r\n[notice] To update, run: pip install --upgrade pip\r\n```\r\n\r\nInstall the package\r\n\r\n```console\r\n$ python -m pip install ./torch-1.12.1-cp39-cp39-manylinux1_x86_64.whl\r\n```\r\n\r\nNow it should appear to pip as installed.\r\n\r\n```console\r\n$ pip install torch==1.12.1\r\nDefaulting to user installation because normal site-packages is not writeable\r\nRequirement already satisfied: torch==1.12.1 in ./.local/lib/python3.9/site-packages (1.12.1)\r\nRequirement already satisfied: typing-extensions in ./.local/lib/python3.9/site-packages (from torch==1.12.1) (4.3.0)\r\n```"
},
{
"body": "# Rolling Alice: Easter Eggs\r\n\r\n> Moved to https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_easter_eggs.md\r\n\r\nEaster eggs are scattered throughout the Alice tutorials. Look for these\r\nbadges to explore aligned trains of thought. \r\n\r\n## [![write-the-docs](https://img.shields.io/badge/write%20the-docs-success)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548)\r\n\r\nDocumentation writing tips, tricks, and alignment recommendations to ensure\r\nwe make it easy to write docs and understand how to fill their contents.\r\n\r\n## [![mindset-security](https://img.shields.io/badge/mindset-security-critical)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548)\r\n\r\nSecurity focused content, pay extra attention here to help keep yourself\r\nand others safe!\r\n\r\n## [![use-the-source](https://img.shields.io/badge/use%20the-source-blueviolet)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548)\r\n\r\nUsing existing project's source code in place of documentation when none is\r\navailable.\r\n\r\n## [![hack-the-planet](https://img.shields.io/badge/hack%20the-planet-blue)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3711548)\r\n\r\nRandom navigation through systems, file formats, and patterns, that might be\r\nhelpful as you're out popping shells."
},
{
"body": "# Troubleshooting Failed Whisper Transcriptions\r\n\r\n- Try reducing the length of the recording to be transcribed in event of \"Killed\" (likely due to out of memory)\r\n\r\n```console\r\n$ ffmpeg -t 60 -i alan-watts-gnosticism.m4a -acodec copy alan-watts-gnosticism-first-60-seconds.m4a\r\nffmpeg version 5.1.1-static https://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2022 the FFmpeg developers\r\n built with gcc 8 (Debian 8.3.0-6)\r\n configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gmp --enable-libgme --enable-gray --enable-libaom --enable-libfribidi --enable-libass --enable-libvmaf --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librubberband --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libvorbis --enable-libopus --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libdav1d --enable-libxvid --enable-libzvbi --enable-libzimg\r\n libavutil 57. 28.100 / 57. 28.100\r\n libavcodec 59. 37.100 / 59. 37.100\r\n libavformat 59. 27.100 / 59. 27.100\r\n libavdevice 59. 7.100 / 59. 7.100\r\n libavfilter 8. 44.100 / 8. 44.100\r\n libswscale 6. 7.100 / 6. 7.100\r\n libswresample 4. 7.100 / 4. 7.100\r\n libpostproc 56. 6.100 / 56. 6.100\r\nInput #0, mov,mp4,m4a,3gp,3g2,mj2, from 'alan-watts-gnosticism.m4a':\r\n Metadata:\r\n major_brand : isom\r\n minor_version : 512\r\n compatible_brands: isomiso2mp41\r\n encoder : Lavf58.24.101\r\n Duration: 00:51:37.36, start: 0.000000, bitrate: 129 kb/s\r\n Stream #0:0[0x1](und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)\r\n Metadata:\r\n handler_name : SoundHandler\r\n vendor_id : [0][0][0][0]\r\nOutput #0, ipod, to 'alan-watts-gnosticism-first-60-seconds.m4a':\r\n Metadata:\r\n major_brand : isom\r\n minor_version : 512\r\n compatible_brands: isomiso2mp41\r\n encoder : Lavf59.27.100\r\n Stream #0:0(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)\r\n Metadata:\r\n handler_name : SoundHandler\r\n vendor_id : [0][0][0][0]\r\nStream mapping:\r\n Stream #0:0 -> #0:0 (copy)\r\nPress [q] to stop, [?] for help\r\nsize= 948kB time=00:01:00.00 bitrate= 129.5kbits/s speed=7.14e+03x\r\nvideo:0kB audio:938kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.159434%\r\n$ file alan-watts-gnosticism-first-60-seconds.m4a\r\nalan-watts-gnosticism-first-60-seconds.m4a: ISO Media, Apple iTunes ALAC/AAC-LC (.M4A) Audio\r\n$ python -uc 'import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"])' alan-watts-gnosticism-first-60-seconds.m4a\r\n```\r\n\r\n\r\n```console\r\n$ ps faux\r\nUSER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND\r\ncoder 1 0.0 0.0 751808 9176 ? 
Ssl Sep19 0:21 ./coder agent\r\ncoder 6052 0.0 0.0 6100 4016 pts/12 Ss 16:44 0:00 \\_ -bash\r\ncoder 6391 34.7 0.2 4647032 731712 pts/12 Rl+ 18:43 5:36 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6520 0.0 0.0 5996 3948 pts/13 Ss 18:56 0:00 \\_ -bash\r\ncoder 6536 0.0 0.0 7648 3292 pts/13 R+ 18:59 0:00 \\_ ps faux\r\n```\r\n\r\n- Noticed the process is spending a lot of time sleeping.\r\n\r\n```console\r\n$ while test 1; do ps faux | grep whisper | grep -v grep | tee -a mem.txt; sleep 0.2; done\r\ncoder 6391 34.4 0.2 4647032 733600 pts/12 Rl+ 18:43 6:27 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.4 0.2 4647032 733600 pts/12 Rl+ 18:43 6:27 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.4 0.2 4647032 733600 pts/12 Sl+ 18:43 6:27 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.4 0.2 4647032 733600 pts/12 Sl+ 18:43 6:28 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.4 0.2 4647032 733600 pts/12 Rl+ 18:43 6:28 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.4 0.2 4647032 733600 pts/12 Sl+ 18:43 6:28 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.4 0.2 4647032 733600 pts/12 Rl+ 18:43 6:28 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.4 0.2 4647032 733600 pts/12 Sl+ 18:43 6:29 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.3 0.2 4647032 733600 pts/12 Sl+ 18:43 6:29 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.4 0.2 4647032 733600 pts/12 Rl+ 18:43 6:29 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.3 0.2 4647032 733600 pts/12 Rl+ 18:43 6:29 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.4 0.2 4647032 733600 pts/12 Sl+ 18:43 6:29 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.3 0.2 4647032 
733600 pts/12 Rl+ 18:43 6:29 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.3 0.2 4647032 733600 pts/12 Rl+ 18:43 6:30 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.3 0.2 4647032 733600 pts/12 Rl+ 18:43 6:30 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.3 0.2 4647032 733600 pts/12 Sl+ 18:43 6:30 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.3 0.2 4647032 733600 pts/12 Sl+ 18:43 6:30 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.3 0.2 4647032 733600 pts/12 Sl+ 18:43 6:30 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.3 0.2 4647032 733600 pts/12 Rl+ 18:43 6:30 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.3 0.2 4647032 733600 pts/12 Rl+ 18:43 6:31 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.3 0.2 4647032 733600 pts/12 Rl+ 18:43 6:31 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\ncoder 6391 34.3 0.2 4647032 733600 pts/12 Sl+ 18:43 6:31 | \\_ /.pyenv/versions/3.9.13/bin/python -uc import sys, whisper; print(whisper.load_model(\"base\").transcribe(sys.argv[-1])[\"text\"]) alan-watts-gnosticism-first-60-seconds.m4a\r\n```\r\n\r\n- Some serious OOM happening here (guessing)\r\n\r\n```console\r\n$ time python -uc 'import sys, whisper; print(whisper.load_model(\"tiny.en\").transcribe(sys.argv[-1], language=\"en\")[\"text\"])' alan-watts-gnosticism.m4a\r\n/home/coder/.local/lib/python3.9/site-packages/whisper/transcribe.py:70: UserWarning: FP16 is not supported on CPU; using FP32 instead\r\n warnings.warn(\"FP16 is not supported on CPU; using FP32 instead\")\r\nKilled\r\n\r\nreal 1m21.526s\r\nuser 0m13.171s\r\nsys 0m12.903s\r\n```"
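If the full-length recording keeps getting killed, one workaround sketch is to split it into fixed-length pieces with ffmpeg's segment muxer and transcribe each piece separately; the 300 second segment length is arbitrary:

```console
$ ffmpeg -i alan-watts-gnosticism.m4a -f segment -segment_time 300 -c copy chunk-%03d.m4a
$ for chunk in chunk-*.m4a; do
    python -uc 'import sys, whisper; print(whisper.load_model("tiny.en").transcribe(sys.argv[-1], language="en")["text"])' "$chunk"
  done
```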
}
]
},
{
"body": "# 2022-09-23 Engineering Log",
"replies": [
{
"body": "## 2022-09-23 @pdxjohnny Engineering Log\r\n\r\n- [Architecting Alice: Alice OS](https://github.com/intel/dffml/discussions/1406#discussioncomment-3720703)\r\n - WSL kept throwing blue screens on too large downloads :( time to run something Linux based as L0\r\n - ![elmo-fire-blue-screens-for-Chaos-God](https://user-images.githubusercontent.com/5950433/192104042-385b37f4-06e1-4193-95e7-dd74c30e708a.png)"
},
{
"body": "# Architecting Alice: OS DecentrAlice\r\n\r\n> Moved to: https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0006_os_decentralice.md\r\n\r\nLet's build an Operating System!\r\n\r\n- Context\r\n - We need a base of operations from which to build on\r\n as we deploy Alice in various contexts.\r\n- Goals\r\n - We want to end up with something that can be used as a daily driver.\r\n- Actions\r\n - We are going to take userspace from Wolfi and kernel from Fedora.\r\n We'll roll in SSI service binaries to auto start on boot.\r\n- Future work\r\n - We'll see what we can do about TPM support / secure boot.\r\n- References\r\n - This tutorial is covered in `OS DecentrAlice: Rolling an OS` **TODO** Update with link to recording once made.\r\n - The resulting commit from completion of this tutorial was: **TODO** Update with link to operations added.\r\n- Feedback\r\n - Please provide feedback / thoughts for extension / improvement about this tutorial in the following discussion thread: https://github.com/intel/dffml/discussions/1414\r\n\r\nWe will verify that the OS boots under a virtualized environment.\r\n\r\nWe will then boot to an arch linux live USB, format a disk, write\r\nthe contents of our new operating system to the root partition,\r\nand install a bootloader (can we use systemd?).\r\n\r\nWe'll leverage QEMU for our virtualized environment and\r\nDockerfiles to define the OS image contents.\r\n\r\n- Arch Linux Live @ `/`\r\n - Wofli @ `/mnt`\r\n - Fedora @ `/mnt/fedora`\r\n\r\n## Base Image Dockerfile\r\n\r\n```Dockerfile\r\n# OS DecentrAlice Base Image Dockerfile\r\n# Docs: https://github.com/intel/dffml/discussions/1406#discussioncomment-3720703\r\n\r\n# Download and build the Self Soverign Identity Service\r\nFROM cgr.dev/chainguard/wolfi-base AS build-ssi-service\r\nRUN apk update && apk add --no-cache --update-cache curl go\r\n\r\nRUN curl -sfL https://github.com/TBD54566975/ssi-service/archive/refs/heads/main.tar.gz \\\r\n | tar xvz \\\r\n && cd /ssi-service-main \\\r\n && go build -tags jwx_es256k -o /ssi-service ./cmd\r\n\r\n# Download the Linux kernel and needed utils to create bootable system\r\nFROM registry.fedoraproject.org/fedora AS build-linux-kernel\r\n\r\nRUN mkdir -p /build/kernel-core-rpms \\\r\n && source /usr/lib/os-release \\\r\n && dnf -y install \\\r\n --installroot=/build/kernel-core-rpms \\\r\n --releasever=\"${VERSION_ID}\" \\\r\n kernel-core \\\r\n kernel-modules \\\r\n systemd \\\r\n systemd-networkd \\\r\n systemd-udev \\\r\n dracut \\\r\n binutils \\\r\n strace \\\r\n kmod-libs\r\n\r\n# First PATH addition\r\n# Add Fedora install PATHs to image environment\r\nRUN mkdir -p /build/kernel-core-rpms/etc \\\r\n && echo \"PATH=\\\"\\${PATH}:${PATH}:/usr/lib/dracut/\\\"\" | tee /build/kernel-core-rpms/etc/environment\r\n\r\n# Configure the OS\r\nFROM cgr.dev/chainguard/wolfi-base\r\n\r\n# Install SSI Service\r\nCOPY --from=build-ssi-service /ssi-service /usr/bin/ssi-service\r\n\r\n# Install Linux Kernel\r\n# TODO Hardlink kernel paths\r\nCOPY --from=build-linux-kernel /build/kernel-core-rpms /fedora\r\n\r\n# Second PATH addition\r\n# Add Wofli install PATHs to image environment\r\nRUN source /fedora/etc/environment \\\r\n && echo \"PATH=\\\"${PATH}\\\"\" | tee /etc/environment /etc/environment-wofli\r\n\r\n# Patch dracut because we could not find what package on Wolfi provides readlink\r\n# RUN sed -i 's/readonly TMPDIR.*/readonly TMPDIR=\"$tmpdir\"/' /freusr/bin/dracut\r\n\r\n# Run depmod to build 
/lib/modules/${KERNEL_VERSION}/modules.dep which is\r\n# required by dracut for efi creation.\r\nRUN chroot /fedora /usr/bin/bash -c \"depmod $(ls /fedora/lib/modules) -a\"\r\n\r\n# TODO(security) Pinning and hash validation on get-pip\r\nRUN apk update && apk add --no-cache --update-cache \\\r\n curl \\\r\n bash \\\r\n python3 \\\r\n sed \\\r\n && curl -sSL https://bootstrap.pypa.io/get-pip.py -o get-pip.py \\\r\n && python get-pip.py\r\n\r\nRUN echo 'mount /dev/sda1 /mnt/boot' | tee /fedora-dracut.sh \\\r\n && echo 'swapon /dev/sda2' | tee -a /fedora-dracut.sh \\\r\n && echo 'mkdir -p /mnt/{proc,dev,sys}' | tee -a /fedora-dracut.sh \\\r\n && echo 'mkdir -p /mnt/var/tmp' | tee -a /fedora-dracut.sh \\\r\n && echo 'mkdir -p /mnt/fedora/var/tmp' | tee -a /fedora-dracut.sh \\\r\n && echo \"cat > /mnt/fedora/run-dracut.sh <<'LOL'\" | tee -a /fedora-dracut.sh \\\r\n && echo 'export PATH=\"${PATH}:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/lib/dracut/\"' | tee -a /fedora-dracut.sh \\\r\n && echo 'export KERNEL_VERSION=\"$(ls /lib/modules)\"' | tee -a /fedora-dracut.sh \\\r\n && echo 'bash -xp /usr/bin/dracut --uefi --kver ${KERNEL_VERSION} --kernel-cmdline \"console=ttyS0 root=/dev/sda3\"' | tee -a /fedora-dracut.sh \\\r\n && echo 'LOL' | tee -a /fedora-dracut.sh \\\r\n && echo 'arch-chroot /mnt/fedora /bin/bash run-dracut.sh' | tee -a /fedora-dracut.sh \\\r\n && echo 'bootctl --esp-path=/mnt/boot install' | tee -a /fedora-dracut.sh \\\r\n && echo 'for file in $(find /mnt/fedora/boot/); do cp -v $file $(echo $file | sed -e \"s/fedora//\" -e \"s/efi\\/EFI/EFI/\"); done' | tee -a /fedora-dracut.sh\r\n\r\nRUN rm /sbin/init \\\r\n && ln -s /fedora/lib/systemd/systemd /sbin/init\r\n\r\n# Install Alice\r\n# ARG ALICE_STATE_OF_ART=0c4b8191b13465980ced3fd1ddfbea30af3d1104\r\n# RUN python3 -m pip install -U setuptools pip wheel\r\n# RUN python3 -m pip install \\\r\n# \"https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml\" \\\r\n# \"https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-feature-git&subdirectory=feature/git\" \\\r\n# \"https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=shouldi&subdirectory=examples/shouldi\" \\\r\n# \"https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-config-yaml&subdirectory=configloader/yaml\" \\\r\n# \"https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-operations-innersource&subdirectory=operations/innersource\" \\\r\n# \"https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=alice&subdirectory=entities/alice\"\r\n\r\nENTRYPOINT bash\r\n```\r\n\r\n### SSI Service\r\n\r\n- TODO\r\n - [ ] User systemd socket and service for `/etc/skel` (the place copied from when using `useradd -m`)\r\n\r\n\r\n### Systemd\r\n\r\n**TODO** Currently systemd is within the fedora chroot which causes issues\r\nwith it's default library search path on load.\r\n\r\nWe could try going any of the following routes next or combination thereof.\r\n\r\n- Wrapper exec on systemd to set `LD_LIBRARY_PATH` before exec\r\n - Possibly with all libs explicitly set (`.so` files) to their location within\r\n the Fedora chroot (`/mnt/fedora` currently).\r\n- Separate Partitions\r\n - Chroot on getty / docker / k3s start (once we get there)\r\n - We haven't messed with docker / k3s yet (something to run containers from Wofli)\r\n- Overlayfs?\r\n - Not sure if this might be helpful here\r\n - Something something systemd target / service to mount Wolfi over Fedora and 
then chroot?\r\n\r\nSTATE_OF_THE_ART: Error bellow for systemd failure to load `.so`'s\r\n\r\n```\r\n Starting initrd-switch-root.service - Switch Root...\r\n[ 7.926443] systemd-journald[229]: Received SIGTERM from PID 1 (systemd).\r\n[ 8.036984] Kernel panic - not syncing: Attempted to kill init! exitcode=0x00007f00\r\n[ 8.037936] CPU: 0 PID: 1 Comm: init Not tainted 5.19.10-200.fc36.x86_64 #1\r\n[/ s b 8in./0i37n93i6t]: Hearrdrwaore name: QEMU Standard PC (i440FX + PIIX, 1996), BIOS 0.0.0 02/06/2015\r\n[ 8.037936] Call Trace:\r\n...\r\n[ 8.131416] </TASK>\r\nr while loading shared libraries: libsystemd-shared-250.so: cannot open shared object file: No such file or directory\r\n```\r\n\r\n## Installation in VM\r\n\r\n- Using DigitalOcean Fedora host with QEMU installed (`dnf -y install qemu`)\r\n - First boot and install via arch PXE\r\n - Mount root partition\r\n - `# mount /dev/sda3 /mnt`\r\n - Install bootloader\r\n - `# bash -x /mnt/fedora/run-dracut.sh`\r\n - Then reboot without PXE to boot into system\r\n- TODO Piggy Back off arch linux install guide\r\n - https://wiki.archlinux.org/title/Installation_guide\r\n\r\n```bash\r\n#!/usr/bin/env bash\r\nset -xeuo pipefail\r\n\r\n# Virtual machine disk image where virtual machine filesystem is stored\r\nVM_DISK=${VM_DISK:-\"${HOME}/vm/image.qcow2\"}\r\n\r\n# Block device we use as an intermediary to mount the guest filesystem from host\r\nVM_DEV=${VM_DEV:-\"/dev/nbd0\"}\r\n\r\n# The directory where we mount the guest filesystem on the host for access and\r\n# modification when not in use by the guest\r\nCHROOT=${CHROOT:-\"${HOME}/vm/decentralice-chroot\"}\r\n\r\n# Extract container image to chroot\r\nIMAGE=${IMAGE:-\"localhost/c-distroliess:latest\"};\r\n\r\ncontainer=$(podman run --rm -d --entrypoint tail \"${IMAGE}\" -F /dev/null);\r\ntrap \"podman kill ${container}\" EXIT\r\n\r\n# Linux kernel command line\r\nCMDLINE=${CMDLINE:-\"console=ttyS0 root=/dev/sda3 rw resume=/dev/sda2 init=/usr/bin/init.sh\"}\r\n\r\n# Location of qemu binary to use\r\nQEMU=${QEMU:-\"qemu-system-x86_64\"}\r\n\r\n# Load the network block device kernel module\r\nsudo modprobe nbd max_part=8\r\n\r\n# Unmount the virtual disk image if it is currently mounted\r\nsudo umount -R \"${CHROOT}\" || echo \"Image was not mounted at ${CHROOT}\"\r\n# Disconnect the network block device\r\nsudo qemu-nbd --disconnect \"${VM_DEV}\" || echo \"Image was not connected as nbd\"\r\n\r\nmount_image() {\r\n sudo qemu-nbd --connect=\"${VM_DEV}\" \"${VM_DISK}\"\r\n sudo mount \"${VM_DEV}p3\" \"${CHROOT}\"\r\n sudo mount \"${VM_DEV}p1\" \"${CHROOT}/boot\"\r\n}\r\n\r\nunmount_image() {\r\n sudo sync\r\n sudo umount -R \"${CHROOT}\"\r\n sudo qemu-nbd --disconnect \"${VM_DEV}\"\r\n}\r\n\r\n# Check if the block device we are going to use to mount the virtual disk image\r\n# already exists\r\nif [ -b \"${VM_DEV}\" ]; then\r\n echo \"VM_DEV already exists: ${VM_DEV}\" >&2\r\n # exit 1\r\nfi\r\n\r\n# Create the virtual disk image and populate it if it does not exist\r\nif [ ! 
-f \"${VM_DISK}\" ]; then\r\n mkdir -p \"${CHROOT}\"\r\n mkdir -p \"$(dirname ${VM_DISK})\"\r\n\r\n # Create the virtual disk image\r\n qemu-img create -f qcow2 \"${VM_DISK}\" 20G\r\n\r\n # Use the QEMU guest utils network block device utility to mount the virtual\r\n # disk image as the $VM_DEV device\r\n sudo qemu-nbd --connect=\"${VM_DEV}\" \"${VM_DISK}\"\r\n # Partition the block device\r\n sudo parted \"${VM_DEV}\" << 'EOF'\r\nmklabel gpt\r\nmkpart primary fat32 1MiB 261MiB\r\nset 1 esp on\r\nmkpart primary linux-swap 261MiB 10491MiB\r\nmkpart primary ext4 10491MiB 100%\r\nEOF\r\n # EFI partition\r\n sudo mkfs.fat -F32 \"${VM_DEV}p1\"\r\n # swap space\r\n sudo mkswap \"${VM_DEV}p2\"\r\n # Linux root partition\r\n sudo mkfs.ext4 \"${VM_DEV}p3\"\r\n sudo mount \"${VM_DEV}p3\" \"${CHROOT}\"\r\n # Boot partiion\r\n sudo mkdir \"${CHROOT}/boot\"\r\n sudo mount \"${VM_DEV}p1\" \"${CHROOT}/boot\"\r\n\r\n # Image to download\r\n podman cp \"${container}:/\" \"${CHROOT}\"\r\n\r\n # Unmount the virtual disk image so the virtual machine can use it\r\n unmount_image\r\nfi\r\n\r\n# Mount the guest file system on the host when we exit the guest\r\ntrap mount_image EXIT\r\n\r\nif [[ ! -f \"$( echo ipxe*.efi)\" ]]; then\r\n curl -sfLO https://archlinux.org/static/netboot/ipxe-arch.16e24bec1a7c.efi\r\nfi\r\n\r\n# Only add -kernel for first install\r\n# -kernel ipxe*.efi \\\r\n\r\n\"${QEMU}\" \\\r\n -smp cpus=2 \\\r\n -m 4096M \\\r\n -enable-kvm \\\r\n -nographic \\\r\n -cpu host \\\r\n -drive file=\"${VM_DISK}\",index=0,media=disk,format=qcow2 \\\r\n -bios /usr/share/edk2/ovmf/OVMF_CODE.fd $@\r\n```\r\n\r\n#### Disk Partitioning\r\n\r\n`decentralice.sh` creates a 20 GB virtual disk in QCOW2 format\r\nand formats partitions according to the following example UEFI\r\nrecommendations.\r\n\r\n- References\r\n - https://wiki.archlinux.org/title/Installation_guide#Boot_loader\r\n - https://wiki.archlinux.org/title/Installation_guide#Example_layouts\r\n\r\n#### Netboot to Live Install Media\r\n\r\nWe download the pxe netboot image and use it to boot to an\r\nArch Linux live image which is usually used for installing\r\nArch Linux, but there is no reason we can't use it to install\r\nAliceOS.\r\n\r\nChoose a contry and mirror then modify \r\n\r\n- References\r\n - https://archlinux.org/releng/netboot/\r\n\r\n```console\r\n$ ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@147.182.254.77 sudo rm -f /root/vm/image.qcow2\r\nWarning: Permanently added '147.182.254.77' (ECDSA) to the list of known hosts.\r\nConnection to 147.182.254.77 closed.\r\n$ python -m asciinema rec --idle-time-limit 0.5 --title \"$(date +%4Y-%m-%d-%H-%M-%ss)\" --command \"ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@147.182.254.77 sudo bash decentralice.sh -kernel ipxe*.efi\" >(xz --stdout - > \"$HOME/asciinema/rec-$(hostname)-$(date +%4Y-%m-%d-%H-%M-%ss).json.xz\")\r\n```\r\n\r\n#### Mount Partitions from Live Install Media `root` Shell\r\n\r\n```console\r\nBoot options: ip=dhcp net.ifnames=0 BOOTIF=01-52:54:00:12:34:56 console=ttyS0\r\n\r\n Arch Linux Netboot\r\n\r\n Settings\r\n Architecture: x86_64\r\n Release: 2022.09.03\r\n Mirror: http://mirrors.cat.pdx.edu/archlinux/\r\n Boot options: ip=dhcp net.ifnames=0 BOOTIF=01-52:54:00:12:34:56 console=tt\r\n\r\n Boot Arch Linux\r\n Drop to iPXE shell\r\n Reboot\r\n Exit iPXE\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\nBooting Arch Linux x86_64 
2022.09.03 from http://mirrors.cat.pdx.edu/archlinux/\r\n\r\nhttp://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/x86_64/vmlinuz-linux... ok\r\nhttp://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/x86_64/vmlinuz-linux.ipxe.sig... ok\r\nhttp://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/amd-ucode.img... ok\r\nhttp://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/amd-ucode.img.ipxe.sig... ok\r\nhttp://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/intel-ucode.img... ok\r\nhttp://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/intel-ucode.img.ipxe.sig... ok\r\nhttp://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/x86_64/initramfs-linux.img... ok\r\nhttp://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/boot/x86_64/initramfs-linux.img.ipxe.sig... ok\r\n:: running early hook [udev]\r\nStarting version 251.4-1-arch\r\n:: running early hook [archiso_pxe_nbd]\r\n:: running hook [udev]\r\n:: Triggering uevents...\r\n:: running hook [memdisk]\r\n:: running hook [archiso]\r\n:: running hook [archiso_loop_mnt]\r\n:: running hook [archiso_pxe_common]\r\nIP-Config: eth0 hardware address 52:54:00:12:34:56 mtu 1500 DHCP\r\nIP-Config: eth0 guessed broadcast address 10.0.2.255\r\nIP-Config: eth0 complete (from 10.0.2.2):\r\n address: 10.0.2.15 broadcast: 10.0.2.255 netmask: 255.255.255.0\r\n gateway: 10.0.2.2 dns0 : 10.0.2.3 dns1 : 0.0.0.0\r\n rootserver: 10.0.2.2 rootpath:\r\n filename :\r\n:: running hook [archiso_pxe_nbd]\r\n:: running hook [archiso_pxe_http]\r\n:: running hook [archiso_pxe_nfs]\r\n:: Mounting /run/archiso/httpspace (tmpfs) filesystem, size='75%'\r\n:: Downloading 'http://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/x86_64/airootfs.sfs'\r\n % Total % Received % Xferd Average Speed Time Time Time Current\r\n Dload Upload Total Spent Left Speed\r\n100 683M 100 683M 0 0 52.3M 0 0:00:13 0:00:13 --:--:-- 65.9M\r\n:: Downloading 'http://mirrors.cat.pdx.edu/archlinux/iso/2022.09.03/arch/x86_64/airootfs.sfs.sig'\r\n % Total % Received % Xferd Average Speed Time Time Time Current\r\n Dload Upload Total Spent Left Speed\r\n100 471 100 471 0 0 7009 0 --:--:-- --:--:-- --:--:-- 7136\r\n:: Signature verification requested, please wait...\r\n[GNUPG:] GOODSIG 044ABFB932C36814 Arch Linux Release Engineering (Ephemeral Signing Key) <arch-releng@lists.archlinux.org>\r\nSignature is OK, continue booting.\r\n:: Mounting /run/archiso/copytoram (tmpfs) filesystem, size=75%\r\n:: Mounting /run/archiso/cowspace (tmpfs) filesystem, size=256M...\r\n:: Copying rootfs image to RAM...\r\ndone.\r\n:: Mounting '/dev/loop0' to '/run/archiso/airootfs'\r\n:: Device '/dev/loop0' mounted successfully.\r\n:: running late hook [archiso_pxe_common]\r\n:: running cleanup hook [udev]\r\n\r\nWelcome to Arch Linux!\r\n\r\n[ 41.600639] I/O error, dev fd0, sector 0 op 0x0:(READ) flags 0x0 phys_seg 1 prio class 0\r\n[ OK ] Created slice Slice /system/getty.\r\n[ OK ] Created slice Slice /system/modprobe.\r\n[ OK ] Created slice Slice /system/serial-getty.\r\n[ OK ] Created slice User and Session Slice.\r\n[ OK ] Started Dispatch Password \u2026ts to Console Directory Watch.\r\n[ OK ] Started Forward Password R\u2026uests to Wall Directory Watch.\r\n[ OK ] Set up automount Arbitrary\u2026s File System Automount Point.\r\n[ OK ] Reached target Local Encrypted Volumes.\r\n[ OK ] Reached target Local Integrity Protected Volumes.\r\n[ OK ] Reached target Path Units.\r\n...\r\n[ OK ] Started Getty on tty1.\r\n[ OK ] Started Serial Getty on ttyS0.\r\n[ OK ] Reached target 
Login Prompts.\r\n\r\nArch Linux 5.19.6-arch1-1 (ttyS0)\r\n\r\narchiso login: root\r\nTo install Arch Linux follow the installation guide:\r\nhttps://wiki.archlinux.org/title/Installation_guide\r\n\r\nFor Wi-Fi, authenticate to the wireless network using the iwctl utility.\r\nFor mobile broadband (WWAN) modems, connect with the mmcli utility.\r\nEthernet, WLAN and WWAN interfaces using DHCP should work automatically.\r\n\r\nAfter connecting to the internet, the installation guide can be accessed\r\nvia the convenience script Installation_guide.\r\n\r\n\r\nLast login: Sun Sep 25 23:55:20 on tty1\r\nroot@archiso ~ # mount /dev/sda3 /mnt\r\nroot@archiso ~ # bash -x /mnt/fedora-dracut.sh\r\n```\r\n\r\n- Now without PXE boot\r\n - Currently systemd takes the \r\n\r\n```console\r\n$ python -m asciinema rec --idle-time-limit 0.5 --title \"$(date +%4Y-%m-%d-%H-%M-%ss)\" --command \"ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@147.182.254.77 sudo bash decentralice.sh\" >(xz --stdout - > \"$HOME/asciinema/rec-$(hostname)-$(date +%4Y-%m-%d-%H-%M-%ss).json.xz\")\r\n+ VM_DISK=/root/vm/image.qcow2\r\n+ VM_DEV=/dev/nbd0\r\n+ CHROOT=/root/vm/decentralice-chroot\r\n+ IMAGE=localhost/c-distroliess:latest\r\n++ podman run --rm -d --entrypoint tail localhost/c-distroliess:latest -F /dev/null\r\n+ container=1b79597e28cbc714043992a46d0498bd31a449c773784e0fab4629ee11244ce1\r\n+ trap 'podman kill 1b79597e28cbc714043992a46d0498bd31a449c773784e0fab4629ee11244ce1' EXIT\r\n+ CMDLINE='console=ttyS0 root=/dev/sda3 rw resume=/dev/sda2 init=/usr/bin/init.sh'\r\n+ QEMU=qemu-system-x86_64\r\n+ sudo modprobe nbd max_part=8\r\n+ sudo umount -R /root/vm/decentralice-chroot\r\n+ sudo qemu-nbd --disconnect /dev/nbd0\r\n/dev/nbd0 disconnected\r\n+ '[' -b /dev/nbd0 ']'\r\n+ echo 'VM_DEV already exists: /dev/nbd0'\r\nVM_DEV already exists: /dev/nbd0\r\n+ '[' '!' -f /root/vm/image.qcow2 ']'\r\n+ trap mount_image EXIT\r\n++ echo ipxe-arch.16e24bec1a7c.efi\r\n+ [[ ! 
-f ipxe-arch.16e24bec1a7c.efi ]]\r\n+ qemu-system-x86_64 -smp cpus=2 -m 4096M -enable-kvm -nographic -cpu host -drive file=/root/vm/image.qcow2,index=0,media=disk,format=qcow2 -bios /usr/shar\r\ne/edk2/ovmf/OVMF_CODE.fd\r\nBdsDxe: loading Boot0001 \"Linux Boot Manager\" from HD(1,GPT,5ED5E31E-F9DF-4168-B087-18AB1EF33E24,0x800,0x82000)/\\EFI\\systemd\\systemd-bootx64.efi\r\nBdsDxe: starting Boot0001 \"Linux Boot Manager\" from HD(1,GPT,5ED5E31E-F9DF-4168-B087-18AB1EF33E24,0x800,0x82000)/\\EFI\\systemd\\systemd-bootx64.efi\r\nEFI stub: Loaded initrd from LINUX_EFI_INITRD_MEDIA_GUID device path\r\n[ 0.000000] Linux version 5.19.10-200.fc36.x86_64 (mockbuild@bkernel01.iad2.fedoraproject.org) (gcc (GCC) 12.2.1 20220819 (Red Hat 12.2.1-2), GNU ld ver\r\nsion 2.37-36.fc36) #1 SMP PREEMPT_DYNAMIC Tue Sep 20 15:15:53 UTC 2022\r\n[ 0.000000] Command line: console=ttyS0 root=/dev/sda3\r\n[ 0.000000] x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'\r\n[ 0.000000] x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'\r\n[ 0.000000] x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'\r\n[ 0.000000] x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256\r\n[ 0.000000] x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.\r\n[ 0.000000] signal: max sigframe size: 1776\r\n[ 0.000000] BIOS-provided physical RAM map:\r\n...\r\n[ 4.505931] systemd[1]: dracut-pre-udev.service - dracut pre-udev hook was skipped because all trigger condition checks failed.\r\n[ 4.511214] audit: type=1130 audit(1664171381.024:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm=\"systemd\" exe=\"/usr/lib/systemd/systemd\" hostname=? addr=? terminal=? res=success'\r\n[ 4.521203] systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...\r\n Starting systemd-tmpfiles-\u2026ate Static Device Nodes in /dev...\r\n[ 4.530842] systemd[1]: Started systemd-journald.service - Journal Service.\r\n[ OK ] Started systemd-journald.service - Journal Service.\r\n Starting syste[ 4.543614] audit: type=1130 audit(1664171381.072:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm=\"systemd\" exe=\"/usr/lib/systemd/systemd\" hostname=? addr=? terminal=? res=success'\r\nmd-tmpfiles-\u2026 Volatile Files and Directories...\r\n[ OK ] Finished systemd-tmpfiles-\u2026reate Static Device Nodes in /dev.\r\n Starting systemd-udevd.ser\u2026ger for Device Events and Files..[ 4.570653] audit: type=1130 audit(1664171381.095:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm=\"systemd\" exe=\"/usr/lib/systemd/systemd\" hostname=? addr=? terminal=? res=success'\r\n.[ 4.580930] audit: type=1334 audit(1664171381.097:7): prog-id=6 op=LOAD\r\n\r\n[ 4.596257] audit: type=1334 audit(1664171381.097:8): prog-id=7 op=LOAD\r\n[ 4.596303] audit: type=1334 audit(1664171381.097:9): prog-id=8 op=LOAD\r\n[ OK ] Finished systemd-tmpfiles-\u2026te Volatile Files and Directories.\r\n[ 4.614382] audit: type=1130 audit(1664171381.146:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm=\"systemd\" exe=\"/usr/lib/systemd/systemd\" hostname=? addr=? terminal=? 
res=success'\r\n[ OK ] Started systemd-udevd.serv\u2026nager for Device Events and Files.\r\n Starting systemd-udev-trig\u2026[0m - Coldplug All udev Devices...\r\n[ OK ] Finished systemd-udev-trig\u2026e - Coldplug All udev Devices.\r\n[ OK ] Reached target sysinit.target - System Initialization.\r\n[ OK ] Reached target basic.target - Basic System.\r\n[ OK ] Reached target remote-fs-p\u2026eparation for Remote File Systems.\r\n[ OK ] Reached target remote-fs.target - Remote File Systems.\r\n[ OK ] Found device dev-sda3.device - QEMU_HARDDISK primary.\r\n[ OK ] Reached target initrd-root\u2026e.target - Initrd Root Device.\r\n Starting systemd-fsck-root\u2026 File System Check on /dev/sda3...\r\n[ OK ] Finished systemd-fsck-root\u2026 - File System Check on /dev/sda3.\r\n Mounting sysroot.mount - /sysroot...\r\n[ 5.543281] EXT4-fs (sda3): mounted filesystem with ordered data mode. Quota mode: none.\r\n[ OK ] Mounted sysroot.mount - /sysroot.\r\n[ OK ] Reached target initrd-root\u2026get - Initrd Root File System.\r\n Starting initrd-parse-etc.\u2026onfiguration from the Real Root...\r\n[ OK ] Finished initrd-parse-etc.\u2026 Configuration from the Real Root.\r\n[ OK ] Reached target initrd-fs.target - Initrd File Systems.\r\n[ OK ] Reached target initrd.target - Initrd Default Target.\r\n Starting dracut-pre-pivot.\u2026acut pre-pivot and cleanup hook...\r\n[ OK ] Finished dracut-pre-pivot.\u2026dracut pre-pivot and cleanup hook.\r\n Starting initrd-cleanup.se\u2026ng Up and Shutting Down Daemons...\r\n[ OK ] Stopped target timers.target - Timer Units.\r\n[ OK ] Stopped dracut-pre-pivot.s\u2026dracut pre-pivot and cleanup hook.\r\n[ OK ] Stopped target initrd.target - Initrd Default Target.\r\n[ OK ] Stopped target basic.target - Basic System.\r\n[ OK ] Stopped target initrd-root\u2026e.target - Initrd Root Device.\r\n[ OK ] Stopped target initrd-usr-\u2026get - Initrd /usr File System.\r\n[ OK ] Stopped target paths.target - Path Units.\r\n[ OK ] Stopped systemd-ask-passwo\u2026quests to Console Directory Watch.\r\n[ OK ] Stopped target remote-fs.target - Remote File Systems.\r\n[ OK ] Stopped target remote-fs-p\u2026eparation for Remote File Systems.\r\n[ OK ] Stopped target slices.target - Slice Units.\r\n[ OK ] Stopped target sockets.target - Socket Units.\r\n[ OK ] Stopped target sysinit.target - System Initialization.\r\n[ OK ] Stopped target swap.target - Swaps.\r\n[ OK ] Stopped systemd-sysctl.service - Apply Kernel Variables.\r\n[ OK ] Stopped systemd-tmpfiles-s\u2026te Volatile Files and Directories.\r\n[ OK ] Stopped target local-fs.target - Local File Systems.\r\n[ OK ] Stopped systemd-udev-trigg\u2026e - Coldplug All udev Devices.\r\n Stopping systemd-udevd.ser\u2026ger for Device Events and Files...\r\n[ OK ] Stopped systemd-vconsole-s\u2026rvice - Setup Virtual Console.\r\n[ OK ] Finished initrd-cleanup.se\u2026ning Up and Shutting Down Daemons.\r\n[ OK ] Stopped systemd-udevd.serv\u2026nager for Device Events and Files.\r\n[ OK ] Closed systemd-udevd-contr\u2026.socket - udev Control Socket.\r\n[ OK ] Closed systemd-udevd-kernel.socket - udev Kernel Socket.\r\n Starting initrd-udevadm-cl\u2026ice - Cleanup udev Database...\r\n[ OK ] Stopped systemd-tmpfiles-s\u2026reate Static Device Nodes in /dev.\r\n[ OK ] Stopped kmod-static-nodes.\u2026reate List of Static Device Nodes.\r\n[ OK ] Finished initrd-udevadm-cl\u2026rvice - Cleanup udev Database.\r\n[ OK ] Reached target initrd-switch-root.target - Switch Root.\r\n Starting initrd-switch-root.service - Switch Root...\r\n[ 
7.926443] systemd-journald[229]: Received SIGTERM from PID 1 (systemd).\r\n[ 8.036984] Kernel panic - not syncing: Attempted to kill init! exitcode=0x00007f00\r\n[ 8.037936] CPU: 0 PID: 1 Comm: init Not tainted 5.19.10-200.fc36.x86_64 #1\r\n[/ s b 8in./0i37n93i6t]: Hearrdrwaore name: QEMU Standard PC (i440FX + PIIX, 1996), BIOS 0.0.0 02/06/2015\r\n[ 8.037936] Call Trace:\r\n[ 8.037936] <TASK>\r\n[ 8.037936] dump_stack_lvl+0x44/0x5c\r\n[ 8.037936] panic+0xfb/0x2b1\r\n[ 8.037936] do_exit.cold+0x15/0x15\r\n[ 8.037936] do_group_exit+0x2d/0x90\r\n[ 8.037936] __x64_sys_exit_group+0x14/0x20\r\n[ 8.037936] do_syscall_64+0x5b/0x80\r\n[ 8.037936] ? do_syscall_64+0x67/0x80\r\n[ 8.037936] entry_SYSCALL_64_after_hwframe+0x63/0xcd\r\n[ 8.037936] RIP: 0033:0x7f9b61282911\r\n[ 8.037936] Code: f7 d8 89 01 48 83 c8 ff c3 be e7 00 00 00 ba 3c 00 00 00 eb 11 0f 1f 40 00 89 d0 0f 05 48 3d 00 f0 ff ff 77 1c f4 89 f0 0f 05 <48> 3d 00 f0 ff ff 76 e7 f7 d8 89 05 7f 29 01 00 eb dd 0f 1f 44 00\r\n[ 8.037936] RSP: 002b:00007ffd45b6dc78 EFLAGS: 00000246 ORIG_RAX: 00000000000000e7\r\n[ 8.037936] RAX: ffffffffffffffda RBX: 00007f9b6128caf8 RCX: 00007f9b61282911\r\n[ 8.037936] RDX: 000000000000003c RSI: 00000000000000e7 RDI: 000000000000007f\r\n[ 8.037936] RBP: 00007f9b6126017f R08: 00007ffd45b6dc88 R09: 000000006128a000\r\n[ 8.037936] R10: 0000000000000020 R11: 0000000000000246 R12: 0000000000000002\r\n[ 8.129077] R13: 0000000000000001 R14: 00007f9b612601a0 R15: 0000000000000000\r\n[ 8.131416] </TASK>\r\nr while loading shared libraries: libsystemd-shared-250.so: cannot open shared object file: No such file or directory\r\n[ 8.131416] Kernel Offset: 0x5000000 from 0xffffffff81000000 (relocation range: 0xffffffff80000000-0xffffffffbfffffff)\r\n[ 8.131416] ---[ end Kernel panic - not syncing: Attempted to kill init! exitcode=0x00007f00 ]---\r\n\r\n\r\n<Ctrl-a x>\r\n\r\nQEMU: Terminated\r\n```\r\n\r\n- TODO\r\n - `--fstab /etc/fstab`?\r\n - Not sure if we need this yet but saving here until dracut we get `EXIT_SUCCESS`\r\n - Add custom bootloader image\r\n - slice image from alice unbirthday gif-2-cli gif and convert to bitmap\r\n - References\r\n - https://man7.org/linux/man-pages/man8/dracut.8.html\r\n - > `--uefi-splash-image <FILE>`\r\n > - Specifies the UEFI stub loader\u2019s splash image. Requires\r\n > bitmap (.bmp) image format.\r\n\r\n### Alice\r\n\r\nInstall Alice!\r\n\r\n## Misc.\r\n\r\n- TODO\r\n - [ ] Updates for fedora packages (aka kernel) will need to be handled.\r\n - We might just re-roll and pull only the layers with kernel stuff? 
TBD\r\n - [ ] motd?\r\n- References\r\n - Chainguard\r\n - https://edu.chainguard.dev/chainguard/chainguard-images/how-to-use-chainguard-images/\r\n - https://edu.chainguard.dev/open-source/melange/getting-started-with-melange/\r\n - We should use melange and apko and setup a secure factory to build images.\r\n - Images\r\n - https://dnf-plugins-core.readthedocs.io/en/latest/download.html\r\n - https://github.com/srossross/rpmfile\r\n - QEMU\r\n - https://pdxjohnny.github.io/linux-kernel/\r\n - https://pdxjohnny.github.io/qemu/\r\n - https://archlinux.org/releng/netboot/\r\n - https://gist.github.com/pdxjohnny/6063d1893c292d1ac0024fb14d1e627d\r\n - Install Guide\r\n - https://wiki.archlinux.org/title/Installation_guide\r\n - https://archlinux.org/releng/netboot/\r\n - https://wiki.archlinux.org/title/Installation_guide#Boot_loader\r\n - https://wiki.archlinux.org/title/Installation_guide#Example_layouts\r\n - Bootloader\r\n - https://man.archlinux.org/man/bootctl.1\r\n - `root@archiso ~ # bootctl --esp-path=/mnt/boot install`\r\n - https://systemd.io/AUTOMATIC_BOOT_ASSESSMENT/\r\n - Type #2 EFI Unified Kernel Images\r\n - https://systemd.io/BOOT_LOADER_SPECIFICATION/\r\n - https://wiki.archlinux.org/title/Installation_guide#Boot_loader\r\n - https://github.com/nwildner/dracut-uefi-simple\r\n - sysadmin\r\n - https://github.com/aurae-runtime/auraed/tree/main/hack\r\n - https://github.com/aurae-runtime/auraed/blob/main/hack/initramfs/mk-initramfs\r\n - https://gist.github.com/pdxjohnny/a0dc3a58b4651dc3761bee65a198a80d#file-run-vm-sh-L125-L141\r\n - ssi-service\r\n - https://github.com/TBD54566975/ssi-service/pull/111\r\n - https://edu.chainguard.dev/open-source/melange/getting-started-with-melange/\r\n - For packaging\r\n - python\r\n - https://github.com/pypa/get-pip\r\n - TPM\r\n - https://systemd.network/linuxx64.efi.stub.html#TPM2%20PCR%20Notes\r\n - Secure Boot\r\n - https://fedoraproject.org/wiki/Secureboot\r\n - https://github.com/rhboot/pesign\r\n - https://github.com/rhboot/shim"
}
]
},
{
"body": "# 2022-09-24 Engineering Log\r\n\r\n- TODO\r\n - [ ] @yukster to investigate creation of meetup\r\n - Possible action items for meetup group\r\n - Get folks together to talk about lasting solutions to technical debt (rather than revolving door reimplementation)\r\n - Increasing awareness of technical debt incurred due to various business and architectural decisions.",
"replies": [
{
"body": "## 2022-09-24 @pdxjohnny Engineering Log\r\n\r\n- There are an infinite number of realities. We experience a subset in series when within the biological form. Time, time is the critical differentiator between this state of consciousness and others. The other states happen all at once, all the time. For whatever reason, if you find yourself in this reality, this one we call life. Know that you\u2019ll only be here for a time. You may come back, but fundamentally, this life is your time.\r\n- [Architecting Alice: OS DecentrAlice](https://github.com/intel/dffml/discussions/1406#discussioncomment-3720703)\r\n\r\n---\r\n\r\n\r\n```bash\r\nps\r\nps faux\r\nll\r\nfind\r\nfind /usr/\r\napk search linux\r\napk search kernel\r\napk search systemd\r\napk search system\r\napk search go\r\napk add go\r\ngo install github.com/magefile/mage\r\ngo install github.com/magefile/mage@v1.14.0\r\napk add git\r\ngit clone https://github.com/TBD54566975/ssi-service\r\ncd ssi-service/\r\nmage build\r\npwd\r\ngo install github.com/magefile/mage\r\nmage build\r\nenv\r\ngo install -h github.com/magefile/mage\r\ngo install -v github.com/magefile/mage\r\ngo install -vvvv github.com/magefile/mage\r\ngo install --debug github.com/magefile/mage\r\ngo install -debug github.com/magefile/mage\r\ngo install --help\r\ngo help install\r\nll ~/go/bin/\r\nls -lAF ~/go/bin/\r\nexport PATH=$PATH:$HOME/go/bin\r\nalias herstory=history\r\nherstory -a\r\ncat ~/.bash_history\r\nmage build\r\nfind .\r\nfile $(find .)\r\napk add file\r\nfile $(find .)\r\nfile $(find .) | grep bin\r\nfile $(find .) | grep -i bin\r\nfile $(find .) | grep -i exe\r\nfile $(find .) | grep -i EFI\r\nfile $(find .)\r\nll\r\nls -lAF\r\ncat magefile.go\r\nls -lAF\r\nls build/\r\nls -lAF cmd/\r\nfile $(find .) | grep -v ssi\r\nfile $(find .) | grep ssi\r\ngit grep go\\ bulid\r\ngit grep bulid\r\ngit grep bulid\r\nl\r\npwd\r\ngrep -rn build .\r\ncat build/Dockerfile\r\ngo build -tags jwx_es256k -o /docker-ssi-service ./cmd\r\nherstory -a\r\nll\r\nls -lAF\r\nls -lAF cmd/\r\n/docker-ssi-service\r\ngo build -tags jwx_es256k netgo -o /docker-ssi-service ./cmd\r\ngo build -tags jwx_es256k -tags netgo -o /docker-ssi-service ./cmd\r\nfile /docker-ssi-service\r\nlld /docker-ssi-service\r\napk add lld\r\napk add build-essential\r\napk add gcc\r\napk add binutils\r\napk add coreutils\r\nlld /docker-ssi-service\r\npwd\r\ncd\r\nrm -rf ssi-service\r\ncurl -sfL https://github.com/TBD54566975/ssi-service/archive/refs/heads/main.tar.gz | tar xvz\r\napk add curl\r\ncurl -sfL https://github.com/TBD54566975/ssi-service/archive/refs/heads/main.tar.gz | tar xvz\r\nherstory -a\r\n```"
}
]
},
{
"body": "# 2022-09-25 Engineering Log",
"replies": [
{
"body": "## 2022-09-25 @pdxjohnny Engineering Log\r\n\r\n- Architecting Alice: COPY Linux Kernel\r\n- [Architecting Alice: OS DecentrAlice](https://github.com/intel/dffml/discussions/1406#discussioncomment-3720703)\r\n\r\n```console\r\n$ cat > fedora.sh <<'EOF'\r\nmount /dev/sda3 /mnt\r\nmount /dev/sda1 /mnt/boot\r\nswapon /dev/sda2\r\nmkdir -p /mnt/{proc,dev,sys}\r\nmkdir -p /mnt/var/tmp\r\nmkdir -p /mnt/fedora/var/tmp\r\n\r\ncat > /mnt/run-dracut.sh <<'LOL'\r\nexport PATH=\"${PATH}:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/lib/dracut/\"\r\nexport KERNEL_VERSION=\"$(ls /lib/modules)\"\r\nbash -xp /usr/bin/dracut --uefi --kver ${KERNEL_VERSION} --kernel-cmdline \"console=ttyS0 root=/dev/sda3\"\r\nLOL\r\n\r\narch-chroot /mnt/fedora /bin/bash run-dracut.sh\r\nEOF\r\n$ bash fedora.sh\r\n...\r\n+ dinfo 'Executing: /usr/bin/dracut --uefi --kver 5.19.10-200.fc36.x86_64 --kernel-cmdline console=ttyS0'\r\n+ set +x\r\nbash-5.1# echo $?\r\n0\r\nbash-5.1# lsblk\r\nNAME MAJ:MIN RM SIZE RO TYPE MOUNTPOINTS\r\nfd0 2:0 1 4K 0 disk\r\nloop0 7:0 0 683.2M 1 loop\r\nsda 8:0 0 20G 0 disk\r\n\u251c\u2500sda1 8:1 0 260M 0 part\r\n\u251c\u2500sda2 8:2 0 10G 0 part [SWAP]\r\n\u2514\u2500sda3 8:3 0 9.8G 0 part\r\nsr0 11:0 1 1024M 0 rom\r\nbash-5.1# find /boot/\r\n/boot/\r\n/boot/System.map-5.19.10-200.fc36.x86_64\r\n/boot/.vmlinuz-5.19.10-200.fc36.x86_64.hmac\r\n/boot/vmlinuz-5.19.10-200.fc36.x86_64\r\n/boot/symvers-5.19.10-200.fc36.x86_64.gz\r\n/boot/efi\r\n/boot/efi/EFI\r\n/boot/efi/EFI/fedora\r\n/boot/efi/EFI/Linux\r\n/boot/efi/EFI/Linux/linux-5.19.10-200.fc36.x86_64-d1a1c3d381b9405ab46417e3535ef1be.efi\r\n/boot/grub2\r\n/boot/initramfs-5.19.10-200.fc36.x86_64.img\r\n/boot/loader\r\n/boot/loader/entries\r\n/boot/loader/entries/d1a1c3d381b9405ab46417e3535ef1be-5.19.10-200.fc36.x86_64.conf\r\n/boot/config-5.19.10-200.fc36.x86_64\r\nbash-5.1#\r\nexit\r\n[root@archiso ~]# bash fedora.shc\r\n[root@archiso ~]# ll /mnt/boot/\r\nbash: ll: command not found\r\n[root@archiso ~]# find !$\r\nfind /mnt/boot/\r\n/mnt/boot/\r\n/mnt/boot/NvVars\r\n[root@archiso ~]# bootctl --esp-path=/mnt/boot install\r\nCreated \"/mnt/boot/EFI\".\r\nCreated \"/mnt/boot/EFI/systemd\".\r\nCreated \"/mnt/boot/EFI/BOOT\".\r\nCreated \"/mnt/boot/loader\".\r\nCreated \"/mnt/boot/loader/entries\".\r\nCreated \"/mnt/boot/EFI/Linux\".\r\nCopied \"/usr/lib/systemd/boot/efi/systemd-bootx64.efi\" to \"/mnt/boot/EFI/systemd/systemd-bootx64.efi\".\r\nCopied \"/usr/lib/systemd/boot/efi/systemd-bootx64.efi\" to \"/mnt/boot/EFI/BOOT/BOOTX64.EFI\".\r\nRandom seed file /mnt/boot/loader/random-seed successfully written (32 bytes).\r\nNot installing system token, since we are running in a virtualized environment.\r\nCreated EFI boot entry \"Linux Boot Manager\".\r\n[root@archiso ~]# find /mnt/boot/\r\n/mnt/boot/\r\n/mnt/boot/NvVars\r\n/mnt/boot/EFI\r\n/mnt/boot/EFI/systemd\r\n/mnt/boot/EFI/systemd/systemd-bootx64.efi\r\n/mnt/boot/EFI/BOOT\r\n/mnt/boot/EFI/BOOT/BOOTX64.EFI\r\n/mnt/boot/EFI/Linux\r\n/mnt/boot/loader\r\n/mnt/boot/loader/entries\r\n/mnt/boot/loader/loader.conf\r\n/mnt/boot/loader/random-seed\r\n/mnt/boot/loader/entries.srel\r\n[root@archiso ~]# for file in $(find /mnt/fedora/boot/); do cp -v $file $(echo $file | sed -e 's/fedora//' -e 's/efi\\/EFI/EFI/'); done\r\n[root@archiso ~]# diff -y <(find /mnt/boot | sort) <(find /mnt/fedora/boot | sed -e 's/fedora\\///' -e 's/efi\\/EFI/EFI/' | sort)\r\n/mnt/boot /mnt/boot\r\n/mnt/boot/.vmlinuz-5.19.10-200.fc36.x86_64.hmac /mnt/boot/.vmlinuz-5.19.10-200.fc36.x86_64.hmac\r\n/mnt/boot/EFI 
/mnt/boot/EFI\r\n/mnt/boot/EFI/BOOT <\r\n/mnt/boot/EFI/BOOT/BOOTX64.EFI <\r\n/mnt/boot/EFI/Linux /mnt/boot/EFI/Linux\r\n/mnt/boot/EFI/Linux/linux-5.19.10-200.fc36.x86_64-d1a1c3d381b /mnt/boot/EFI/Linux/linux-5.19.10-200.fc36.x86_64-d1a1c3d381b\r\n/mnt/boot/EFI/systemd | /mnt/boot/EFI/fedora\r\n/mnt/boot/EFI/systemd/systemd-bootx64.efi <\r\n/mnt/boot/NvVars <\r\n/mnt/boot/System.map-5.19.10-200.fc36.x86_64 /mnt/boot/System.map-5.19.10-200.fc36.x86_64\r\n/mnt/boot/config-5.19.10-200.fc36.x86_64 /mnt/boot/config-5.19.10-200.fc36.x86_64\r\n > /mnt/boot/efi\r\n > /mnt/boot/grub2\r\n/mnt/boot/initramfs-5.19.10-200.fc36.x86_64.img /mnt/boot/initramfs-5.19.10-200.fc36.x86_64.img\r\n/mnt/boot/loader /mnt/boot/loader\r\n/mnt/boot/loader/entries /mnt/boot/loader/entries\r\n/mnt/boot/loader/entries.srel <\r\n/mnt/boot/loader/entries/d1a1c3d381b9405ab46417e3535ef1be-5.1 /mnt/boot/loader/entries/d1a1c3d381b9405ab46417e3535ef1be-5.1\r\n/mnt/boot/loader/loader.conf | /mnt/boot/symvers-5.19.10-200.fc36.x86_64.gz\r\n/mnt/boot/loader/random-seed <\r\n/mnt/boot/vmlinuz-5.19.10-200.fc36.x86_64 /mnt/boot/vmlinuz-5.19.10-200.fc36.x86_64\r\n```"
}
]
},
{
"body": "# 2022-09-26 Engineering Logs",
"replies": [
{
"body": "## 2022-09-26 @pdxjohnny Engineering Logs\r\n\r\n- Alice\r\n - State of the art updated to 98335d941116e76bbf4e07422adc2b5061e47934\r\n - Overlay of CI/CD library detection example: https://github.com/intel/dffml/commit/90d5c52f4dd64f046a2e2469d001e32ec2d53966\r\n\r\nInstall Alice: https://github.com/intel/dffml/tree/alice/entities/alice/\r\n\r\n```console\r\n$ python -m venv .venv\r\n$ . .venv/bin/activate\r\n$ python -m pip install -U pip setuptools wheel\r\n$ export ALICE_STATE_OF_ART=98335d941116e76bbf4e07422adc2b5061e47934\r\n$ python -m pip install \\\r\n \"[https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml](https://github.com/intel/dffml/archive/$%7BALICE_STATE_OF_ART%7D.zip#egg=dffml)\" \\\r\n \"[https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-feature-git&subdirectory=feature/git](https://github.com/intel/dffml/archive/$%7BALICE_STATE_OF_ART%7D.zip#egg=dffml-feature-git&subdirectory=feature/git)\" \\\r\n \"[https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=shouldi&subdirectory=examples/shouldi](https://github.com/intel/dffml/archive/$%7BALICE_STATE_OF_ART%7D.zip#egg=shouldi&subdirectory=examples/shouldi)\" \\\r\n \"[https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-config-yaml&subdirectory=configloader/yaml](https://github.com/intel/dffml/archive/$%7BALICE_STATE_OF_ART%7D.zip#egg=dffml-config-yaml&subdirectory=configloader/yaml)\" \\\r\n \"[https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-operations-innersource&subdirectory=operations/innersource](https://github.com/intel/dffml/archive/$%7BALICE_STATE_OF_ART%7D.zip#egg=dffml-operations-innersource&subdirectory=operations/innersource)\" \\\r\n \"[https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=alice&subdirectory=entities/alice](https://github.com/intel/dffml/archive/$%7BALICE_STATE_OF_ART%7D.zip#egg=alice&subdirectory=entities/alice)\"\r\n```\r\n\r\nInstall this overlay (from this commit in this example):\r\n\r\n```console\r\n$ python -m pip install --force-reinstall --upgrade \"git+https://github.com/intel/dffml@d2a38d47445241fc99d26bc2a51184caa88bd033#subdirectory=entities/alice\"\r\n```\r\n\r\nCollect metrics on a repo using `alice shouldi contribute`:\r\n\r\n```console\r\n$ alice -log debug shouldi contribute -keys https://github.com/pdxjohnny/httptest 2>&1 | tee .alice.shouldi.contribute.log.$(date \"+%4Y-%m-%d-%H-%M\").txt\r\n$ alice -log debug shouldi contribute -record-def GitHubRepoID -keys 149512216 2>&1 | tee .alice.shouldi.contribute.log.$(date \"+%4Y-%m-%d-%H-%M\").txt\r\n$ python -c 'import yaml, json, sys; print(yaml.dump(json.load(sys.stdin)))' < .tools/open-architecture/innersource/repos.json\r\nuntagged:\r\n https://github.com/aliceoa/example-github-action:\r\n features:\r\n alice.shouldi.contribute.cicd:cicd_action_library:\r\n result: true\r\n group_by:\r\n ActionYAMLFileWorkflowUnixStylePath:\r\n - my_action_name/action.yml\r\n```\r\n\r\n- Generating JSON schema\r\n - https://pydantic-docs.helpmanual.io/usage/schema/\r\n - https://pydantic-docs.helpmanual.io/install/\r\n - https://pydantic-docs.helpmanual.io/usage/model_config/\r\n - https://pydantic-docs.helpmanual.io/usage/schema/#schema-customization\r\n - Initial commit: 168a3e26c62d7e0c8dd92b1761ec5fad273fb9c6\r\n - Added `$schema` to make output schema a valid Manifest schema per ADR requirements\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md\r\n- KERI\r\n - https://keri.one\r\n 
- https://humancolossus.foundation/blog/thinking-of-did-keri-on/keri-resources/\r\n- References\r\n - https://open-music.org/\r\n - https://github.com/fzipp/gocyclo\r\n - > Calculate cyclomatic complexities of functions in Go source code.\r\n\r\n```console\r\n$ curl -sfL https://github.com/intel/dffml/ | grep octolytics-dimension-repository_id\r\n <meta name=\"octolytics-dimension-user_id\" content=\"17888862\" /><meta name=\"octolytics-dimension-user_login\" content=\"intel\" /><meta name=\"octolytics-dimension-repository_id\" content=\"149512216\" /><meta name=\"octolytics-dimension-repository_nwo\" content=\"intel/dffml\" /><meta name=\"octolytics-dimension-repository_public\" content=\"true\" /><meta name=\"octolytics-dimension-repository_is_fork\" content=\"false\" /><meta name=\"octolytics-dimension-repository_network_root_id\" content=\"149512216\" /><meta name=\"octolytics-dimension-repository_network_root_nwo\" content=\"intel/dffml\" />\r\ncoder@coder-john-s-andersen-alice:/src/dffml$ curl -sfL https://github.com/intel/dffml/ | grep octolytics-dimension-repository_id | sed -e 's/octolytics-dimension-repository_id\" content=\"//'\r\n <meta name=\"octolytics-dimension-user_id\" content=\"17888862\" /><meta name=\"octolytics-dimension-user_login\" content=\"intel\" /><meta name=\"149512216\" /><meta name=\"octolytics-dimension-repository_nwo\" content=\"intel/dffml\" /><meta name=\"octolytics-dimension-repository_public\" content=\"true\" /><meta name=\"octolytics-dimension-repository_is_fork\" content=\"false\" /><meta name=\"octolytics-dimension-repository_network_root_id\" content=\"149512216\" /><meta name=\"octolytics-dimension-repository_network_root_nwo\" content=\"intel/dffml\" />\r\ncoder@coder-john-s-andersen-alice:/src/dffml$ curl -sfL https://github.com/intel/dffml/ | grep octolytics-dimension-repository_id | sed -e 's/.*octolytics-dimension-repository_id\" content=\"//'\r\n149512216\" /><meta name=\"octolytics-dimension-repository_nwo\" content=\"intel/dffml\" /><meta name=\"octolytics-dimension-repository_public\" content=\"true\" /><meta name=\"octolytics-dimension-repository_is_fork\" content=\"false\" /><meta name=\"octolytics-dimension-repository_network_root_id\" content=\"149512216\" /><meta name=\"octolytics-dimension-repository_network_root_nwo\" content=\"intel/dffml\" />\r\ncoder@coder-john-s-andersen-alice:/src/dffml$ curl -sfL https://github.com/intel/dffml/ | grep octolytics-dimension-repository_id | sed -e 's/.*octolytics-dimension-repository_id\" content=\"//' -e 's/\".*//'\r\n149512216\r\ncoder@coder-john-s-andersen-alice:/src/dffml $ gh api https://api.github.com/repositories/149512216 | jq -r '.clone_url'\r\nhttps://github.com/intel/dffml.git\r\n```\r\n\r\nAdded GitHubRepoID to URL lookup via https://github.com/intel/dffml/commit/4d64f011ccdee8882adbc4b7447953c4416ceb64\r\n\r\nRun the metric collection\r\n\r\n```console\r\ncoder@coder-john-s-andersen-alice:/src/dffml$ alice -log debug shouldi contribute -record-def GitHubRepoID -keys 149512216\r\n```\r\n\r\nConvert to YAML for easy reading\r\n\r\n```console\r\n$ python -c 'import yaml, json, sys; print(yaml.dump(json.load(sys.stdin)))' < .tools/open-architecture/innersource/repos.json\r\nuntagged:\r\n https://github.com/trekhleb/javascript-algorithms:\r\n extra: {}\r\n features:\r\n dffml_operations_innersource.operations:badge_maintained:\r\n result: https://img.shields.io/badge/Maintainance-Active-green\r\n dffml_operations_innersource.operations:badge_unmaintained:\r\n result: 
https://img.shields.io/badge/Maintainance-Inactive-red\r\n group_by:\r\n GitHubActionsWorkflowUnixStylePath:\r\n - .github/workflows/CI.yml\r\n author_line_count:\r\n - Oleksii Trekhleb: 370\r\n bool:\r\n - true\r\n commit_shas:\r\n - d3c0ee6f7af3fce4a3a2bdc1c5be36d7c2d9793a\r\n release_within_period:\r\n - false\r\n key: https://github.com/trekhleb/javascript-algorithms\r\n last_updated: '2022-09-26T15:13:00Z'\r\n```\r\n\r\n- Accidentally force pushed\r\n - Enabled branch protected on the `alice` branch\r\n - Went to PR and looked for \"forced pushed\" in logs\r\n - Grabbed the commit and found the compare because we can download the patchset but it won't let us create a branch off it that we could tell\r\n - https://github.com/intel/dffml/compare/alice...0c4b8191b13465980ced3fd1ddfbea30af3d1104.patch\r\n - Downloaded with curl\r\n - `curl -sfLO https://github.com/intel/dffml/compare/alice...0c4b8191b13465980ced3fd1ddfbea30af3d1104.patch`\r\n - Removed the first patch which we rebase squashed other commits into\r\n - `vim alice...0c4b8191b13465980ced3fd1ddfbea30af3d1104.patch`\r\n - Apply patches (there were 15 after removing the collecting Jenkins patch)\r\n - `git am < alice...0c4b8191b13465980ced3fd1ddfbea30af3d1104.patch`\r\n\r\n```yaml\r\n check_if_valid_git_repository_URL:\r\n inputs:\r\n URL:\r\n - dffml_operations_innersource.cli:github_repo_id_to_clone_url: result\r\n - seed\r\n cleanup_git_repo:\r\n inputs:\r\n repo:\r\n - clone_git_repo: repo\r\n clone_git_repo:\r\n conditions:\r\n - check_if_valid_git_repository_URL: valid\r\n inputs:\r\n URL:\r\n - dffml_operations_innersource.cli:github_repo_id_to_clone_url: result\r\n - seed\r\n ssh_key:\r\n - seed\r\n count_authors:\r\n inputs:\r\n author_lines:\r\n - git_repo_author_lines_for_dates: author_lines\r\n dffml_feature_git.feature.operations:git_grep:\r\n inputs:\r\n repo:\r\n - clone_git_repo: repo\r\n search:\r\n - seed\r\n dffml_operations_innersource.cli:ensure_tokei:\r\n inputs: {}\r\n dffml_operations_innersource.cli:github_repo_id_to_clone_url:\r\n inputs:\r\n repo_id:\r\n - seed\r\n```\r\n\r\n- Ah, forgot to call `COLLECTOR_DATAFLOW.update_by_origin()`\r\n - We always forget about this, we should probably call `dataflow.update_by_origin()` by default on orchestrator context entry.\r\n- In progress on auto creation of JSON schema from single object or list of example objects\r\n\r\n```diff\r\ndiff --git a/configloader/jsonschema/tests/test_config.py b/configloader/jsonschema/tests/test_config.py\r\nindex ea4852862..2a0b9ffa1 100644\r\n--- a/configloader/jsonschema/tests/test_config.py\r\n+++ b/configloader/jsonschema/tests/test_config.py\r\n@@ -137,4 +137,6 @@ class TestConfig(AsyncTestCase):\r\n async with configloader() as ctx:\r\n original = {\"Test\": [\"dict\"]}\r\n reloaded = await ctx.loadb(await ctx.dumpb(original))\r\n+ from pprint import pprint\r\n+ pprint(reloaded)\r\n self.assertEqual(original, TEST_0_SCHEMA_SHOULD_BE)\r\n```\r\n\r\n```console\r\n$ python -m unittest discover -v\r\ntest_0_dumpb_loadb (tests.test_config.TestConfig) ... 
{'$schema': 'https://intel.github.io/dffml/manifest-format-name.0.0.2.schema.json',\r\n 'definitions': {'FooBar': {'properties': {'count': {'title': 'Count',\r\n 'type': 'integer'},\r\n 'size': {'title': 'Size',\r\n 'type': 'number'}},\r\n 'required': ['count'],\r\n 'title': 'FooBar',\r\n 'type': 'object'},\r\n 'Gender': {'description': 'An enumeration.',\r\n 'enum': ['male', 'female', 'other', 'not_given'],\r\n 'title': 'Gender',\r\n 'type': 'string'}},\r\n 'description': 'This is the description of the main model',\r\n 'properties': {'Gender': {'$ref': '#/definitions/Gender'},\r\n 'foo_bar': {'$ref': '#/definitions/FooBar'},\r\n 'snap': {'default': 42,\r\n 'description': 'this is the value of snap',\r\n 'exclusiveMaximum': 50,\r\n 'exclusiveMinimum': 30,\r\n 'title': 'The Snap',\r\n 'type': 'integer'}},\r\n 'required': ['foo_bar'],\r\n 'title': 'Main',\r\n 'type': 'object'}\r\nFAIL\r\n\r\n======================================================================\r\nFAIL: test_0_dumpb_loadb (tests.test_config.TestConfig)\r\n----------------------------------------------------------------------\r\nTraceback (most recent call last):\r\n File \"/src/dffml/dffml/util/asynctestcase.py\", line 115, in run_it\r\n result = self.loop.run_until_complete(coro(*args, **kwargs))\r\n File \"/.pyenv/versions/3.9.13/lib/python3.9/asyncio/base_events.py\", line 647, in run_until_complete\r\n return future.result()\r\n File \"/src/dffml/configloader/jsonschema/tests/test_config.py\", line 142, in test_0_dumpb_loadb\r\n self.assertEqual(original, TEST_0_SCHEMA_SHOULD_BE)\r\nAssertionError: {'Test': ['dict']} != {'title': 'Main', 'description': 'This is t[665 chars]g'}}}\r\nDiff is 1276 characters long. Set self.maxDiff to None to see it.\r\n\r\n----------------------------------------------------------------------\r\nRan 1 test in 0.005s\r\n\r\nFAILED (failures=1)\r\n```\r\n\r\n- TODO\r\n - [ ] Add option for output configloader similar to `-log` for all CLI commands.\r\n - [ ] Enables serialization of returned objects from `CMD.run()` methods into to arbitrary formats.\r\n - [ ] `JSONSchemaConfigLoaderConfig.multi: bool` could allow us to interpret the input as a set of inputs which the generated schema should conform to all."
},
{
"body": "# How Does W3C Work?\r\n\r\n- W3C groups are chartered for a set amount of time\r\n- https://w3c.github.io/did-use-cases/\r\n - WG will be focusing on interoperability\r\n - Ensure DID methods interoperate\r\n - Will try to define what does interoperability mean\r\n - Would be nice to have a schema for a did\r\n - Example: PKI Cert DID\r\n - Structure around how application would go about solving an authentication or authorization challenge\r\n - Could be made to work with zero knowledge proofs or other arbitrary methods\r\n - Point is largely to ensure you don't have to use centralized PKI\r\n- Vol 3: Politics\r\n - Sometimes folks object to continuing a WG charter on pollical or philosophical groups\r\n - WG members sometimes raise concern of opponents charters concerns on the grounds they want to preserve currently advantageous positions held due to lack of standards."
}
]
},
{
"body": "# 2022-09-27 Engineering Logs\r\n\r\n- SPDX 2.3\r\n - https://www.chainguard.dev/unchained/whats-new-in-spdx-2-3\r\n- DX\r\n - https://kenneth.io/post/developer-experience-infrastructure-dxi\r\n- IPVM\r\n - https://github.com/ipvm-wg/spec/discussions/3\r\n - https://github.com/ipvm-wg/spec/discussions/7\r\n - https://fission.codes/blog/ipfs-thing-breaking-down-ipvm/",
"replies": [
{
"body": "## 2022-09-27 @pdxjohnny Engineering Logs\r\n\r\n- Install plugin with subdirectory from commit from git\r\n - `python -m venv .venv`\r\n - `source .venv`\r\n - `python -m pip install --upgrade setuptools pip wheel`\r\n - `python -m pip install --upgrade \"git+https://github.com/intel/dffml@17ccb5b76f261d2725a64528e25669ef97920d70#subdirectory=entities/alice\"`\r\n - pypi proxy is how we enable manifest BOM component swap out for downstream validation within 2nd party CI setup (workaround for dependency links issue)\r\n - References\r\n - https://github.com/intel/dffml/pull/1207\r\n - https://github.com/intel/dffml/pull/1061\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3676224\r\n\r\n```\r\n$ dffml version\r\ndffml 0.4.0 /src/dffml/dffml 5c89b6780 (dirty git repo)\r\ndffml-config-yaml 0.1.0 /src/dffml/configloader/yaml/dffml_config_yaml 5c89b6780 (dirty git repo)\r\ndffml-config-image not installed\r\ndffml-config-jsonschema 0.0.1 /src/dffml/configloader/jsonschema/dffml_config_jsonschema 5c89b6780 (dirty git repo)\r\ndffml-model-scratch not installed\r\ndffml-model-scikit not installed\r\ndffml-model-tensorflow not installed\r\ndffml-model-tensorflow-hub not installed\r\ndffml-model-vowpalWabbit not installed\r\ndffml-model-xgboost not installed\r\ndffml-model-pytorch not installed\r\ndffml-model-spacy not installed\r\ndffml-model-daal4py not installed\r\ndffml-model-autosklearn not installed\r\ndffml-feature-git 0.3.0 /src/dffml/feature/git/dffml_feature_git 5c89b6780 (dirty git repo)\r\ndffml-feature-auth not installed\r\ndffml-operations-binsec not installed\r\ndffml-operations-data not installed\r\ndffml-operations-deploy not installed\r\ndffml-operations-image not installed\r\ndffml-operations-nlp not installed\r\ndffml-operations-innersource 0.0.1 /src/dffml/operations/innersource/dffml_operations_innersource 5c89b6780 (dirty git repo)\r\ndffml-service-http not installed\r\ndffml-source-mysql not installed\r\n```\r\n\r\n- Encourage and coordinate collaborative documentation of strategy and implementation as living documentation to help community communicate amongst itself and facilitate sync with potential users / other communities / aligned workstreams.\r\n- SCITT\r\n - https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md\r\n - https://github.com/ietf-scitt/use-cases/pull/18\r\n- Stream of Consciousness\r\n - Decentralized Web Node and Self-Sovereign Identity Service\r\n - https://github.com/TBD54566975/ssi-service/tree/main/sip/sips/sip4\r\n - https://forums.tbd.website/t/sip-4-discussion-dwn-message-processing/137\r\n - https://github.com/TBD54566975/ssi-service/pull/113\r\n - Gabe approved 17 minutes ago\r\n - Chaos smiles on us again\r\n - https://github.com/TBD54566975/ssi-service/blob/3869b8ef2808210201ae6c43e2e0956a85950fc6/pkg/dwn/dwn_test.go#L22-L58\r\n - https://identity.foundation/credential-manifest/\r\n - > For User Agents (e.g. wallets) and other service that wish to engage with Issuers to acquire credentials, there must exist a mechanism for assessing what inputs are required from a Subject to process a request for credential(s) issuance. 
The Credential Manifest is a common data format for describing the inputs a Subject must provide to an Issuer for subsequent evaluation and issuance of the credential(s) indicated in the Credential Manifest.\r\n >\r\n > Credential Manifests do not themselves define the contents of the output credential(s), the process the Issuer uses to evaluate the submitted inputs, or the protocol Issuers, Subjects, and their User Agents rely on to negotiate credential issuance.\r\n > \r\n > ![image](https://user-images.githubusercontent.com/5950433/192642680-627f9da6-ebb1-45b6-9872-7202e8b3fcaf.png)\r\n - In our distributed compute setup, credential issuance is the execution (which we had been looking at confirming the trades of via the tbDEX protocol, no work has been done on that front recently from DFFML side)\r\n - What they refer to as a \"Credential Manifest\" is similar to what we refer to as an \"Manifest Instance\".\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md\r\n - `SpecVersion` has all the properties we require of Manifests (see `$schema`) so we can indeed classify a \"Credential Manifest\" as a Manifest.\r\n - Alignment looking strong!\r\n - > ![image](https://user-images.githubusercontent.com/5950433/192644284-3cf55d65-ca00-4c25-98fa-babf1bfd945d.png)\r\n - https://github.com/TBD54566975/ssi-service/pull/113/files#diff-7926652f7b7153343e273a0a72f87cb0cdf4c3063ec912cdb95dc541a8f2785dR62"
},
{
"body": "## 2022-9-27: Day 1: Innovation Day 1 Keynote LIVE WEBCAST\r\n\r\n> Notes from webcast\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/192823017-a3ec1a2d-4cd8-466b-a82b-71a977949943.png)\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/192618679-43ecd987-def5-4799-90f6-9dc8f4d7d877.png)\r\n\r\n- Webcast: https://twitter.com/intel/status/1574492026988642317\r\n- Pat quotes\r\n - Committed to a strategy of building a more balanced and resilient supply chain for the world\r\n - We are torridly (\"full of intense emotion...\": https://en.wiktionary.org/wiki/torrid) moving to the future.\r\n - We will continue to be the stewards of Moore's law into the future\r\n - Intel, be my supply chain manager\r\n - Tech for good impact\r\n - Commitment to being open\r\n - Our collective potential as an industry is unleashed when we enable openness, choice, and trust\r\n - Our objective is that developers whether software or hardware you see the future,\r\n - and our job at Intel is to open that future up to you,\r\n - working together on open frameworks that you can trust.\r\n - I'm excited we have the opportunity to come together to learn, grow, build, challenge and help each other,\r\n - and together we've taken a peak into the future, one that we will create together.\r\n- https://cloud.intel.com\r\n - Developer cloud\r\n- Greg: \"Software, the soul of the machine\"\r\n - Software defined, silicon enhanced\r\n - Vibrant chipplet ecosystem\r\n - UCIe\r\n - Universal Chipplet Interconnect Express\r\n- Champion of open marketplace\r\n- auto optimization of xeon speedup 10x\r\n- https://geti.intel.com/ (end of year)\r\n- text to image demo\r\n - using latent diffusion\r\n - https://twitter.com/pdxjohnny/status/1572438573336662017?s=20&t=6rHO8ShUU0eIffdvcJzLPw\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0000_introduction.md\r\n - Looks like we're accelerating\r\n- Alignment\r\n - \"Our objective is that developers whether software or hardware you see the future, and our job at Intel is to open that future up to you\" [Pat]\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0036/reply_0013.md\r\n - \"Software, the soul of the machine\" [Greg]\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#entity-analysis-trinity\r\n- TODO\r\n - [x] @pdxjohnny Reach out to Ria Cheruvu to see if she is interested in collaborating on Alice's ethics or other aspects.\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_preface.md#volume-5-alices-adventures-in-wonderland"
}
]
},
{
"body": "# 2022-09-28 Engineering Logs\r\n\r\n- Self-Sovereign Identity Service\r\n - https://github.com/TBD54566975/ssi-service/tree/main/sip\r\n- https://lu.ma/ipvm\r\n - Tuesday, October 11, 2022 9:00 AM-10:00 AM\r\n - > \u200bThis call is open to all, but is focused on implementers, following the IETF's rough \"consensus and running code\" ethos.\r\n >\r\n > \u200bThe IPVM is an effort to add content-addressed computation to IPFS. The requires specifying calling convention, distributed scheduling, session receipts, mobile computing, and auto-upgradable IPFS internals.\r\n >\r\n > - \u200bLinks\r\n > - \u200b[Community Calls](https://github.com/ipvm-wg/spec/discussions/categories/community-call)\r\n > - \u200b[GitHub Org](https://github.com/ipvm-wg)\r\n > - \u200b[Discord Channel](https://discord.gg/eudkhw9NQJ)\r\n > - \u200b[IPFS \u00feing '22 Slides](https://noti.st/expede/oq0ULd/ipvm-interplanetary-vm)\r\n >\r\n > > \u200bWasm modules, their arguments, intermediate states, their outputs, and managed effects can be described as IPLD graphs. IPVM is a strategy to support generalized deterministic computation in a serverless style on top of IPFS with optional side-channel matchmaking on Filecoin, and extend the same benefits of shared data blocks to computation.\r\n- GitHub Actions for downstream validation of 2nd party plugins.\r\n - Issue: Need container images running for some (`dffml-source-mysql` integration tests).\r\n - Use DERP to join running actions jobs.\r\n - Use privilege separation of two user accounts.\r\n - Credit to Matt for this idea came up with trying to make API token permission delegation more granular than what is currently supported, same role based copy user scheme.\r\n - Everything is terraform templates (coder, k8s), dockerfiles and actions workflows (coder setup-ssh and then do port forwarding, now you can spin up anything).\r\n - Those can all be described as dataflows and synthesized to\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_forward.md#supply-chain-security",
"replies": [
{
"body": "## 2022-09-28 @pdxjohnny Engineering Logs\r\n\r\n- Terraform\r\n - https://registry.terraform.io/providers/hashicorp/http/latest/docs/data-sources/http\r\n - https://registry.terraform.io/providers/hashicorp/kubernetes/latest/docs\r\n- VSCode\r\n - https://sourcegraph.com/search?q=repo:%5Egithub%5C.com/microsoft/.*+remotePlatform&patternType=standard\r\n - Goal: DERP remote connect to ssh coder\r\n - Tabled this for later\r\n - https://github.com/coder/coder/search?q=derp\r\n - They added support for a config option!\r\n - https://github.com/coder/coder/pull/4030\r\n - https://github.com/coder/coder/blob/7e54413d3b39d8da8cd404190739a7de35f467de/docs/networking.md\r\n - Tailscale added official docs on running DERP servers!\r\n - https://tailscale.com/kb/1118/custom-derp-servers/#why-run-your-own-derp-server\r\n - https://github.com/coder/coder/blob/7e54413d3b39d8da8cd404190739a7de35f467de/docs/networking/port-forwarding.md\r\n - https://github.dev/intel/dffml\r\n - https://github.com/microsoft/vscode/blob/236adc221bb31701db4c2a36ffed544653b26311/src/vs/workbench/contrib/welcomeGettingStarted/browser/gettingStarted.contribution.ts#L253-L285\r\n - https://github.com/microsoft/vscode-docs/blob/b0cc336a950effd3d5c012900a6ec1ba613fc8fb/docs/remote/troubleshooting.md\r\n - https://sourcegraph.com/search?q=context:global+repo:%5Egithub%5C.com/microsoft/.*+showLoginTerminal&patternType=standard\r\n - https://github.com/microsoft/vscode-cpptools/blob/ebb24763bd1143d9177a5fa6a7b70ade8c9f05ab/Extension/src/SSH/sshCommandRunner.ts\r\n - Seems like a vendored version of what we are looking for\r\n - https://github.com/microsoft/vscode/blob/0c22a33a9d670a84309447b36abdbd8c04ee6219/src/vs/workbench/services/remote/common/remoteAgentService.ts#L20\r\n - https://github.com/microsoft/vscode/blob/b7d5b65a13299083e92bca91be8fa1289e95d5c1/src/vs/workbench/services/remote/browser/remoteAgentService.ts#L22\r\n - https://github.com/microsoft/vscode/blob/b7d5b65a13299083e92bca91be8fa1289e95d5c1/src/vs/platform/remote/browser/browserSocketFactory.ts#L268\r\n- GitHub Actions for downstream validation of 2nd party plugins.\r\n - https://docs.github.com/en/actions/using-jobs/running-jobs-in-a-container\r\n - https://docs.github.com/en/actions/using-containerized-services/about-service-containers\r\n - docs: tutorials: rolling alice: forward: security: supply chain: Mention tie to distributed compute\r\n - https://github.com/intel/dffml/commit/e9af134d07f104e6db89ac872a8c2249198261da\r\n - https://twitter.com/pdxjohnny/status/1575152364440657920\r\n - https://twitter.com/pdxjohnny/status/1574974594863472640\r\n- Open Architecture\r\n - Threat Modeling\r\n - [FIRST](https://www.first.org/cvss/v2/team)\r\n - [Open SSF](https://openssf.org/)\r\n - https://openssf.org/oss-security-mobilization-plan/\r\n - Integration points\r\n - https://github.com/ossf/scorecard\r\n - https://github.com/ossf/criticality_score\r\n - https://github.com/ossf/osv-schema\r\n - Manual ask first, do you do threat modeling?\r\n - Eventually attestations / assertions\r\n - Get involved with risk assessment work in OpenSSF happening.\r\n - Lot's happening in ID security threats, stay engaged there.\r\n - Risk assessment work might land here.\r\n- Upstream communities which may be good places to show up and participate\r\n - OpenSSF Identifying Security Threats (still)\r\n- Similar\r\n - https://github.com/ossf/fuzz-introspector/blob/main/doc/Architecture.md\r\n - https://github.com/chaoss/wg-risk\r\n - 
https://github.com/chaoss/wg-risk/blob/main/focus-areas/dependency-risk-assessment/upstream-code-dependencies.md\r\n- CHAOSS Augur\r\n - https://github.com/chaoss/augur/blob/main/docker-compose.yml\r\n - https://github.com/chaoss/augur/blob/main/scripts/docker/docker-setup-database.sh\r\n - https://github.com/chaoss/augur/pkgs/container/augur_backend\r\n - https://oss-augur.readthedocs.io/en/main/getting-started/installation.html\r\n - https://oss-augur.readthedocs.io/en/main/development-guide/workers/creating_a_new_worker.html\r\n\r\n![initial-sketch-of-abstract-compute-architecture](https://user-images.githubusercontent.com/5950433/196192835-3a6ddb72-6a52-4043-bb6c-348382f2fcac.jpeg)\r\n\r\n- TODO\r\n - [ ] `CITATIONS.cff` demo\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3510908\r\n - https://securitytxt.org/ RFC 9116"
},
{
"body": "## Quotes\r\n\r\n- \u201cI thrive in Chaos. Its beyond Chaos\u201d [Alice]"
}
]
},
{
"body": "# 2022-09-29 Engineering Logs",
"replies": [
{
"body": "## 2022-09-29 @pdxjohnny Engineering Logs\r\n\r\n- SPIFFE\r\n - https://github.com/spiffe/spire/issues/1003\r\n- rekor\r\n - https://github.com/sigstore/rekor/blob/main/docker-compose.yml\r\n- Open Policy Agent\r\n - https://github.com/transmute-industries/did-eqt/blob/main/docs/did-eqt-opa-primer.md\r\n- Great org README\r\n - https://github.com/transmute-industries\r\n- Verifiable Data TypeScript Library\r\n - https://github.com/transmute-industries/verifiable-data\r\n- Sidetree\r\n - https://identity.foundation/sidetree/spec/\r\n - > ![sidetree-arch](https://identity.foundation/sidetree/spec/diagrams/sidetree-system.svg)\r\n >\r\n > #### [DID State Patches](https://identity.foundation/sidetree/spec/#did-state-patches)\r\n > Sidetree defines a delta-based [Conflict-Free Replicated Data Type](https://en.wikipedia.org/wiki/Conflict-free_replicated_data_type) system, wherein the metadata in a Sidetree-based implementation is controlled by the cryptographic PKI material of individual entities in the system, represented by DIDs. While the most common form of state associated with the DIDs in a Sidetree-based implementation is a [DID Document](https://w3c.github.io/did-core/), Sidetree can be used to maintain any type of DID-associated state.\r\n >\r\n > Sidetree specifies a general format for patching the state associated with a DID, called Patch Actions, which define how to deterministic mutate a DID\u2019s associated state. Sidetree further specifies a standard set of Patch Actions (below) implementers MAY use to facilitate DID state patching within their implementations. Support of the standard set of Patch Actions defined herein IS NOT required, but implementers MUST use the Patch Action format for defining patch mechanisms within their implementation. The general Patch Action format is defined as follows:\r\n > - `add-public-keys`\r\n > - `remove-public-keys`\r\n > - `add-services`\r\n > - `remove-services`\r\n > - `ietf-json-patch`\r\n >\r\n > #### [Proof of Fee](https://identity.foundation/sidetree/spec/#proof-of-fee)\r\n >\r\n > [NOTE](https://identity.foundation/sidetree/spec/#note-6) This section is non-normative\r\n >\r\n > Sidetree implementers MAY choose to implement protective mechanisms designed to strengthen a Sidetree network against low-cost spurious operations. These mechanisms are primarily designed for open, permissionless implementations utilizing public blockchains that feature native crypto-economic systems.\r\n- GitHub Actions\r\n - https://docs.github.com/en/actions/using-jobs/running-jobs-in-a-container\r\n - https://docs.github.com/en/actions/using-containerized-services/about-service-containers\r\n - https://github.com/jenkinsci/custom-war-packager/issues/173\r\n- Misc. 
diffs lying around\r\n\r\n```diff\r\ndiff --git a/dffml/df/base.py b/dffml/df/base.py\r\nindex 4f84c1c7c..1303e41c4 100644\r\n--- a/dffml/df/base.py\r\n+++ b/dffml/df/base.py\r\n@@ -15,11 +15,12 @@ from typing import (\r\n Union,\r\n Optional,\r\n Set,\r\n+ ContextManager,\r\n )\r\n from dataclasses import dataclass, is_dataclass, replace\r\n from contextlib import asynccontextmanager\r\n \r\n-from .exceptions import NotOpImp\r\n+from .exceptions import NotOpImp, RetryOperationException\r\n from .types import (\r\n Operation,\r\n Input,\r\n@@ -94,6 +95,7 @@ class OperationImplementationContext(BaseDataFlowObjectContext):\r\n self.parent = parent\r\n self.ctx = ctx\r\n self.octx = octx\r\n+ self.op_retries = None\r\n \r\n @property\r\n def config(self):\r\n@@ -102,6 +104,31 @@ class OperationImplementationContext(BaseDataFlowObjectContext):\r\n \"\"\"\r\n return self.parent.config\r\n \r\n+\r\n+ @asynccontextmanager\r\n+ async def raiseretry(self, retries: int) -> ContextManager[None]:\r\n+ \"\"\"\r\n+ Use this context manager to have the orchestrator call the operation's\r\n+ ``run()`` method multiple times within the same\r\n+ OperationImplementationContext entry.\r\n+\r\n+ Useful for\r\n+\r\n+ TODO\r\n+\r\n+ - Backoff\r\n+\r\n+ >>> def myop(self):\r\n+ ... with self.raiseretry(5):\r\n+ ... if self.op_current_retry < 4:\r\n+ ... raise Exception()\r\n+ \"\"\"\r\n+ try:\r\n+ yield\r\n+ except Exception as error:\r\n+ raise RetryOperationException(retries) from error\r\n+\r\n+\r\n @abc.abstractmethod\r\n async def run(self, inputs: Dict[str, Any]) -> Union[bool, Dict[str, Any]]:\r\n \"\"\"\r\ndiff --git a/dffml/df/exceptions.py b/dffml/df/exceptions.py\r\nindex 3ec596d6c..06606a3f8 100644\r\n--- a/dffml/df/exceptions.py\r\n+++ b/dffml/df/exceptions.py\r\n@@ -32,3 +32,8 @@ class ValidatorMissing(Exception):\r\n \r\n class MultipleAncestorsFoundError(NotImplementedError):\r\n pass\r\n+\r\n+\r\n+class RetryOperationException(Exception):\r\n+ def __init__(self, retires: int) -> None:\r\n+ self.retires = retires\r\ndiff --git a/dffml/df/memory.py b/dffml/df/memory.py\r\nindex f6f15f5a0..740fc7614 100644\r\n--- a/dffml/df/memory.py\r\n+++ b/dffml/df/memory.py\r\n@@ -27,6 +27,7 @@ from .exceptions import (\r\n ValidatorMissing,\r\n MultipleAncestorsFoundError,\r\n NoInputsWithDefinitionInContext,\r\n+ RetryOperationException,\r\n )\r\n from .types import (\r\n Input,\r\n@@ -39,6 +40,7 @@ from .types import (\r\n from .base import (\r\n OperationException,\r\n OperationImplementation,\r\n+ OperationImplementationContext,\r\n FailedToLoadOperationImplementation,\r\n BaseDataFlowObject,\r\n BaseDataFlowObjectContext,\r\n@@ -1190,6 +1192,7 @@ class MemoryOperationImplementationNetworkContext(\r\n ctx: BaseInputSetContext,\r\n octx: BaseOrchestratorContext,\r\n operation: Operation,\r\n+ opctx: OperationImplementationContext,\r\n inputs: Dict[str, Any],\r\n ) -> Union[bool, Dict[str, Any]]:\r\n \"\"\"\r\n@@ -1198,9 +1201,7 @@ class MemoryOperationImplementationNetworkContext(\r\n # Check that our network contains the operation\r\n await self.ensure_contains(operation)\r\n # Create an opimp context and run the operation\r\n- async with self.operations[operation.instance_name](\r\n- ctx, octx\r\n- ) as opctx:\r\n+ with contextlib.nullcontext():\r\n self.logger.debug(\"---\")\r\n self.logger.debug(\r\n \"%s Stage: %s: %s\",\r\n@@ -1251,22 +1252,28 @@ class MemoryOperationImplementationNetworkContext(\r\n \"\"\"\r\n Run an operation in our network.\r\n \"\"\"\r\n- if not operation.retry:\r\n- return await 
self.run_no_retry(ctx, octx, operation, inputs)\r\n- for retry in range(0, operation.retry):\r\n- try:\r\n- return await self.run_no_retry(ctx, octx, operation, inputs)\r\n- except Exception:\r\n- # Raise if no more tries left\r\n- if (retry + 1) == operation.retry:\r\n- raise\r\n- # Otherwise if there was an exception log it\r\n- self.logger.error(\r\n- \"%r: try %d: %s\",\r\n- operation.instance_name,\r\n- retry + 1,\r\n- traceback.format_exc().rstrip(),\r\n- )\r\n+ async with self.operations[operation.instance_name](\r\n+ ctx, octx\r\n+ ) as opctx:\r\n+ opctx.retries = operation.retry\r\n+ for retry in range(0, operation.retry):\r\n+ try:\r\n+ return await self.run_no_retry(ctx, octx, operation, opctx, inputs)\r\n+ except Exception:\r\n+ if isinstance(error, RetryOperationException):\r\n+ retries = error.retries\r\n+ if not retries:\r\n+ raise\r\n+ # Raise if no more tries left\r\n+ if (retry + 1) == retries:\r\n+ raise\r\n+ # Otherwise if there was an exception log it\r\n+ self.logger.error(\r\n+ \"%r: try %d: %s\",\r\n+ operation.instance_name,\r\n+ retry + 1,\r\n+ traceback.format_exc().rstrip(),\r\n+ )\r\n \r\n async def operation_completed(self):\r\n await self.completed_event.wait()\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/readme.py b/entities/alice/alice/please/contribute/recommended_community_standards/readme.py\r\nindex 437601358..836d8f175 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/readme.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/readme.py\r\n@@ -183,10 +183,11 @@ class OverlayREADME:\r\n \"\"\"\r\n Use the issue title as the pull request title\r\n \"\"\"\r\n- async for event, result in dffml.run_command_events(\r\n- [\"gh\", \"issue\", \"view\", \"--json\", \"title\", \"-q\", \".title\", readme_issue,],\r\n- logger=self.logger,\r\n- events=[dffml.Subprocess.STDOUT],\r\n- ):\r\n- if event is dffml.Subprocess.STDOUT:\r\n- return result.strip().decode()\r\n+ with self.raiseretry(5):\r\n+ async for event, result in dffml.run_command_events(\r\n+ [\"gh\", \"issue\", \"view\", \"--json\", \"title\", \"-q\", \".title\", readme_issue,],\r\n+ logger=self.logger,\r\n+ events=[dffml.Subprocess.STDOUT],\r\n+ ):\r\n+ if event is dffml.Subprocess.STDOUT:\r\n+ return result.strip().decode()\r\ndiff --git a/source/mongodb/dffml_source_mongodb/source.py b/source/mongodb/dffml_source_mongodb/source.py\r\nindex 01621851e..656524d75 100644\r\n--- a/source/mongodb/dffml_source_mongodb/source.py\r\n+++ b/source/mongodb/dffml_source_mongodb/source.py\r\n@@ -19,6 +19,7 @@ class MongoDBSourceConfig:\r\n collection: str = None\r\n tlsInsecure: bool = False\r\n log_collection_names: bool = False\r\n+ bypass_document_validation: bool = False\r\n\r\n def __post_init__(self):\r\n uri = urllib.parse.urlparse(self.uri)\r\n@@ -36,6 +37,7 @@ class MongoDBSourceContext(BaseSourceContext):\r\n {\"_id\": record.key},\r\n {\"_id\": record.key, **record.export()},\r\n upsert=True,\r\n+ bypass_document_validation=self.parent.config.bypass_document_validation,\r\n )\r\n\r\n def document_to_record(self, document, key=None):\r\n```"
},
{
"body": "## 2022-09-29 IETF SCITT Technical Meeting\r\n\r\n- Meeting Invite for SCITT Technical Meeting\r\n - https://armltd.zoom.us/j/95609091197?pwd=V3NndVF1WGZzNUJDUGUzcEVWckxOdz09 \r\n - Meeting ID: 956 0909 1197 \r\n - Passcode: 65442 four\r\n - +442034815240,,95609091197#,,,,*654424# United Kingdom\r\n- Yogesh Deshpande sent this out pre meeting on the mailing list:\r\n - SCITT Technical Meeting Agenda\r\n - Use Case Discussion\r\n - Threat Model Discussions\r\n - Link to Technical Notes Documents:\r\n - https://docs.google.com/document/d/1euqijlS2EgZysIfjMrisyzWTPwTUsxSZ5j_eVNXOmWA/edit\r\n- Joe\r\n - Working with Mike at [MSR] (Microsoft?)\r\n- Architecture Misc. Related (not discussed)\r\n - https://github.com/ietf-scitt/draft-birkholz-scitt-architecture/issues/24\r\n - RATs to SCITT terminology mapping to date\r\n- Last time\r\n - Didn't get into threat model discussion\r\n- Use cases\r\n - [Hardware / Microelectronics Use Case](https://github.com/ietf-scitt/use-cases/blob/main/hardware_microelectronics.md)\r\n - [DRAFT SBOM Use Case](https://github.com/rjb4standards/Presentations/raw/master/2022-0912-SBOM%20Use%20Case.pdf)\r\n - [DRAFT Software Supply Chain Artifact Examples](https://github.com/or13/use-cases/blob/59f8623abc3c351125fc097ac56cf88ae8ea2f1b/software_artifact_examples.md)\r\n - [DRAFT OpenSSF Metrics](https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md)\r\n - This is the one we're most closely (timeline wise) connected to.\r\n- SBOM use case aligns closely with NIST guidelines\r\n- What's in the registry\r\n - Is it the Signed SBOM itself? No, it's the attestation from the notary (gatekeeper)\r\n - The notary has the permissions to insert\r\n - What goes on chain is an assertion\r\n- Consumers have no way to verify the digitality signed object \r\n - They should be able to submit the digitality signed object (content addressable) a query registries and determine trust via inspection of notary claims within the registry.\r\n - To see if the entity has been registered\r\n- Example: Produce new version of embed TLS\r\n- SBOMs need to go in registry with other trusted data\r\n - We need many different factors in determining trust\r\n - We can insert more than just notarizations around SBOMs\r\n- Orie: Let's focus on single registry use cases for now\r\n - Two permissions models we'll focus on\r\n - https://github.com/ietf-scitt/draft-birkholz-scitt-architecture/issues/25\r\n - Public read, private write\r\n - Probably more complex policies would be active here (closer to full R/W)\r\n - private read, private write\r\n - Policy layer\r\n - If inputs are always hashes, then how do you make sense of should you accept it or not?\r\n - If the claims are rich, the policy can be rich (in terms of what can be applied).\r\n - You might have to go to an auditor, then it's a private read scenarios (DID resolution with UCAN auth for example)\r\n - What kind of policy could we apply to claims, or might want to apply to claims\r\n - https://github.com/ietf-scitt/draft-birkholz-scitt-architecture/issues/26\r\n- Situation where data is not notarized\r\n - Just sent as a package of requirements from end customer\r\n - We have to comply with their data requirements, customer maintains the trusted registry \r\n- On insert\r\n - Have to auth that signature on COSE sign 1 is from the entity from the header\r\n - COSE header tells you claims\r\n - Content Types tell you what the payload is\r\n - SCITT instance could use policy to validate\r\n - 
https://github.com/transmute-industries/did-eqt/blob/main/docs/did-eqt-opa-primer.md#securing-did-method-operations-with-opa\r\n - Alignment here with previous Open Architecture train of thought\r\n - [2022-07-20 Identifying Security Threats WG](https://github.com/intel/dffml/discussions/1406#discussioncomment-3191292)\r\n - [2022-07-19 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956)\r\n - [2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)](https://github.com/intel/dffml/discussions/1406#discussioncomment-3223361)\r\n- Receipts are a critical part of this\r\n - SCITT implementation is required to produce and independently verifiable cryptographic receipt\r\n - You get back a (effectively countersignature), its been registered, it's tamper proof\r\n - You don't have to query\r\n - It's then independently verifiable, it carries the proff with it\r\n - Its' in the draft 1 for the architecture and it's been in Sylvan Clebesch team's work implementation wise and in the draft of the receipts doc.\r\n - https://datatracker.ietf.org/doc/draft-birkholz-scitt-architecture/\r\n - https://datatracker.ietf.org/doc/draft-birkholz-scitt-receipts/\r\n- Dick: Looking for agreement on:\r\n - Is there a\r\n - Notary?\r\n - Registry?\r\n - etc.\r\n- Dick: Looking for agreement on objective function agreement:\r\n - Give consumers a means to verify a digitally signed object\r\n - It should include any claims that it is trustworthy\r\n- Roy: All we know is it was valid at the time it was signed\r\n - Notary: Monty was Monty at the time you signed this\r\n- Authenticode signs with two different signatures so if they have to they can revoke one and roll it\r\n- Open Source Software\r\n - We'll be inserting things as we build them sometimes via self notarization\r\n- Yogesh\r\n - Rebuilding binary exact would allow for others to notarize build process without attested compute\r\n - Fully Private\r\n - Fully Public\r\n - Designated roles have access\r\n - We don't want to restrict our work to a specific deployment\r\n - Notary has a role to play but we would like to make it a nice to have on to of existing\r\n - Revisit this, Roy and John see notery as critical\r\n - What are the levels of auditing we want to be done\r\n - I have a receipt, I know that it's policy has been met\r\n - What is the next level of auditing you want?\r\n - There may be compute or other cost associated with going another level deep of auditing.\r\n- Monty: TCG forums have considerable interest in understanding firmware (TPM, etc.)\r\n - SBOM like \"manifests\"\r\n- We are still focusing on software as the core use case.\r\n - When the right time comes, we can open the architecture to other ecosystems\r\n - The agreement at Philly was focus will be on software but we will architect it such that it could include hardware. We will when the right time comes\r\n - We are doing it in a generic way. 
it could be used in other scenarios, we want to not pidgin hole into one vertical.\r\n- Orie: Defect in certain verifiable data systems (ones that log every interaction)\r\n - In certain high security systems even a read is a write!\r\n - This could be expensive in a public read scenario\r\n - Cost associated with cold storage evaluation raises interesting questions\r\n - Related to distributed compute\r\n - https://twitter.com/pdxjohnny/status/1575152364440657920\r\n - https://identity.foundation/sidetree/spec/#proof-of-fee\r\n - [2022-09-29 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406#discussioncomment-3763478)\r\n- Read receipt\r\n - I did a query at this point of time\r\n - Proof of the most recent read of something\r\n - Threat model: Is there a Time of Check Time of Use here?\r\n - What if you need proof someone did a read?\r\n- TODO\r\n - [ ] Sequence diagram for Notary and Verifier\r\n - https://github.com/ietf-scitt/draft-birkholz-scitt-architecture/issues/27\r\n - [ ] @pdxjohnny: Update these notes with references to async tbDEX contract notes from Alice thread around audit level.\r\n - For future discussion \r\n - [ ] Dick: Definition on mailing list for what we are hashing against (file data stream?)\r\n - Critical for content addressability\r\n - We need to be careful of hashing compressed or decompressed objects"
}
]
},
{
"body": "# 2022-09-30 Engineering Logs",
"replies": [
{
"body": "## 2022-09-30 @pdxjohnny Engineering Logs\r\n\r\n- in-toto\r\n - manifest\r\n - https://docs.sigstore.dev/cosign/attestation\r\n- GitHub Actions\r\n - https://github.blog/2022-04-07-slsa-3-compliance-with-github-actions/\r\n - https://github.blog/2021-12-06-safeguard-container-signing-capability-actions/\r\n - https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/configuring-openid-connect-in-cloud-providers#adding-permissions-settings\r\n - https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/configuring-openid-connect-in-cloud-providers#requesting-the-jwt-using-environment-variables\r\n - https://github.com/slsa-framework/slsa-github-generator/blob/main/.github/workflows/generator_container_slsa3.yml\r\n - https://security.googleblog.com/2022/04/improving-software-supply-chain.html\r\n - https://docs.sigstore.dev/fulcio/oidc-in-fulcio/#oidc-token-requirements-with-extracted-claims\r\n - https://docs.sigstore.dev/cosign/openid_signing/#custom-infrastructure\r\n\r\n> For example:\r\n\r\n```yaml\r\njobs:\r\n job:\r\n runs-on: ubuntu-latest\r\n steps:\r\n - uses: actions/github-script@v6\r\n id: script\r\n timeout-minutes: 10\r\n with:\r\n debug: true\r\n script: |\r\n const token = process.env['ACTIONS_RUNTIME_TOKEN']\r\n const runtimeUrl = process.env['ACTIONS_ID_TOKEN_REQUEST_URL']\r\n core.setOutput('TOKEN', token.trim())\r\n core.setOutput('IDTOKENURL', runtimeUrl.trim())\r\n```\r\n\r\n> You can then use curl to retrieve a JWT from the GitHub OIDC provider. For example:\r\n\r\n```yaml\r\n - run: |\r\n IDTOKEN=$(curl -H \"Authorization: bearer $\" $ -H \"Accept: application/json; api-version=2.0\" -H \"Content-Type: application/json\" -d \"{}\" | jq -r '.value')\r\n echo $IDTOKEN\r\n jwtd() {\r\n if [[ -x $(command -v jq) ]]; then\r\n jq -R 'split(\".\") | .[0],.[1] | @base64d | fromjson' <<< \"${1}\"\r\n echo \"Signature: $(echo \"${1}\" | awk -F'.' '{print $3}')\"\r\n fi\r\n }\r\n jwtd $IDTOKEN\r\n echo \"::set-output name=idToken::${IDTOKEN}\"\r\n id: tokenid\r\n```\r\n\r\n- References\r\n - https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#customizing-the-token-claims\r\n - https://docs.sigstore.dev/fulcio/oidc-in-fulcio#oidc-token-requirements-with-extracted-claims\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/193351919-a3ab6573-e92d-4cc4-9edc-ccf8142e6129.png)\r\n\r\n- SPIFFE\r\n - https://docs.sigstore.dev/security/\r\n - > #### Proving Identity in Sigstore\r\n > Sigstore relies on the widely used OpenID Connect (OIDC) protocol to prove identity. When running something like cosign sign, users will complete an OIDC flow and authenticate via an identity provider (GitHub, Google, etc.) to prove they are the owner of their account. Similarly, automated systems (like GitHub Actions) can use Workload Identity or [SPIFFE](https://spiffe.io/) Verifiable Identity Documents (SVIDs) to authenticate themselves via OIDC. The identity and issuer associated with the OIDC token is embedded in the short-lived certificate issued by Sigstore\u2019s Certificate Authority, Fulcio.\r\n- fulcio\r\n - https://docs.sigstore.dev/fulcio/oidc-in-fulcio#supported-oidc-token-issuers\r\n- TODO\r\n - [ ] Write the wave (weekly sync meetings and rolling alice engineering logs), correlate the asciinema and the DFFML codebase, leverage CodeGen\r\n - https://github.com/salesforce/CodeGen"
},
{
"body": "## 2022-09-28 Andrew Ng's Intel Innovation Luminary Keynote Notes\r\n\r\n- References\r\n - \"joint AI Developer Program where developers can train, test, and deploy their AI models.\"\r\n - https://twitter.com/intel/status/1575221403409866752\r\n - https://www.intel.com/content/www/us/en/newsroom/news/2022-intel-innovation-day-2-livestream-replay.html#gs.djq36o\r\n - https://datacentricai.org/\r\n - Datasheets for Datasets\r\n - https://arxiv.org/abs/1803.09010\r\n - > The machine learning community currently has no standardized process for documenting datasets, which can lead to severe consequences in high-stakes domains. To address this gap, we propose datasheets for datasets. In the electronics industry, every component, no matter how simple or complex, is accompanied with a datasheet that describes its operating characteristics, test results, recommended uses, and other information. By analogy, we propose that every dataset be accompanied with a datasheet that documents its motivation, composition, collection process, recommended uses, and so on. Datasheets for datasets will facilitate better communication between dataset creators and dataset consumers, and encourage the machine learning community to prioritize transparency and accountability.\r\n- AI = Code + Data \r\n - The code is a solved problem!!! Get it off GitHub or something!\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/193328916-b9232099-79b1-4c3d-9b7a-768822249630.png)\r\n\r\n- Slides\r\n - Data-Centric AI\r\n - is the discipline of systematically engineering the data used to build an AI system\r\n - (This is what we're doing with Alice)\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/193330714-4bcceea4-4402-468f-82a9-51882939452c.png)\r\n\r\n---\r\n\r\n- Alignment\r\n - The iterative process of ML development\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#entity-analysis-trinity\r\n - Intent / Train model\r\n - Establish correlations between threat model intent and collected data / errors (telemetry or static analysis, policy, failures)\r\n - Dynamic analysis / Improve data\r\n - We tweak the code to make it do different things to see different data. The application of overlays. Think over time.\r\n - Static / Error analysis\r\n - There might be async debug initiated here but this maps pretty nicely conceptually since we'd think of this as a static process, we already have some errors to analyze if we're at this step.\r\n\r\n![Entity Analysis Trinity](https://user-images.githubusercontent.com/5950433/188203911-3586e1af-a1f6-434a-8a9a-a1795d7a7ca3.svg)"
}
]
},
{
"body": "# 2022-10-02 Engineering Logs",
"replies": [
{
"body": "## 2022-10-02 @pdxjohnny Engineering Logs\r\n\r\n- They finally made a tutorial for this!\r\n - https://recursion.wtf/posts/infinity_mirror_hypercrystal/\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/193464907-c760a5f7-707f-499d-bf74-0115cc87e204.png)"
}
]
},
{
"body": "# 2022-10-03 Engineering Logs\r\n\r\n- https://www.alignmentforum.org/tags/all",
"replies": [
{
"body": "## 2022-10-03 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Update 2nd Party ADR with example downstream validation across DFFML 3rd party plugin sets where compute access may be restricted to maintainers within those ad-hoc formed organziations (the repo owners).\r\n - [ ] Reuse SPDX Change Proposal template for DFFML\r\n - https://github.com/spdx/change-proposal\r\n - [ ] `.github/workflows/alice_shouldi_contribute.yml` add input which is list of overlays which are anything passable to `pip install` as an argument via command line `pip install` interface (rather than requirements.txt limitations), call via reusable workflow using SLSA demos.\r\n - This gives us arbitrary execution of metric collection with any overlays with provenance for runtime and therefore data and models downstream.\r\n - https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md\r\n - https://github.com/ietf-scitt/use-cases/issues/14\r\n - As a follow on to the OpenSSF Metrics use case document and\r\n [Living Threat Models are better than Dead Threat Models](https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw),\r\n [Rolling Alice: Volume 1: Coach Alice: Chapter 1: Down the Dependency Rabbit-Hole Again](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md)\r\n will cover how we identify and query provenance on dependencies where caching\r\n on data flow execution is assisted via querying public SCITT infrastructure\r\n and sourcing cached state from trustworthy parties.\r\n - https://github.com/pdxjohnny/use-cases/commit/ab70fea395f729c1ee07f041745d790762904134\r\n- https://mailarchive.ietf.org/arch/msg/scitt/LjKVVNldjFnFLjtUTyPawTIaC0I/\r\n - Reproduced below\r\n\r\n---\r\n\r\n\r\nArchive: https://mailarchive.ietf.org/arch/msg/scitt/LjKVVNldjFnFLjtUTyPawTIaC0I/\r\nRe: [SCITT] Responding to Roy's request to stimulate discussions on hashing\r\nOrie Steele <orie@transmute.industries> Mon, 03 October 2022 13:50 UTC[Show header](https://mailarchive.ietf.org/arch/msg/scitt/LjKVVNldjFnFLjtUTyPawTIaC0I/#)\r\n\r\nWe have a weekly meeting regarding this:\r\nhttps://github.com/mesur-io/post-quantum-signatures\r\n\r\nThere are a few challenges that prevent us from using Dilithium, Falcon or\r\nSPHINCs today, vs using LMS or XMSS (which have their own challenges, being\r\nstateful).\r\n\r\nThe key integration point for us is COSE_Key and COSE_Sign / Counter Sign.\r\n\r\nIf you are interested in helping with COSE representations for PQC\r\nsignatures, we could use more contributors / reviews / PRs.\r\n\r\nRegards,\r\n\r\nOS\r\n\r\n\r\nOn Mon, Oct 3, 2022 at 8:42 AM John Andersen [<johnandersenpdx@gmail.com>](mailto:&lt;johnandersenpdx@gmail.com&gt;)\r\nwrote:\r\n\r\n> Hi all,\r\n>\r\n> We should be sure to align with NIST post quantum guidance for all\r\n> recommendations we include in SCITT documents involving the selection of\r\n> cryptographic algorithms. It would be a shame if a breakthrough in quantum\r\n> computing disrupted the security of our supply chain. It would be good for\r\n> us to define our roll forward strategy in the threat model. 
As attacks\r\n> increase in success against various cryptographic algorithms we want SCITT\r\n> to remain an effective pattern nonetheless.\r\n>\r\n> References:\r\n> - https://blog.cloudflare.com/nist-post-quantum-surprise/\r\n> -\r\n> https://csrc.nist.gov/Projects/post-quantum-cryptography/selected-algorithms-2022\r\n>\r\n> Thanks,\r\n> John\r\n>\r\n> On Mon, Oct 3, 2022 at 05:59 Russ Housley [<housley@vigilsec.com>](mailto:&lt;housley@vigilsec.com&gt;) wrote:\r\n>\r\n>> Ray:\r\n>>\r\n>> I understand the point that you are making about checking the hash of\r\n>> large object stored in the cloud, but MD5 is not suitable for integrity\r\n>> checking. See RFC 6151.\r\n>>\r\n>> Russ\r\n>>\r\n>> On Sep 30, 2022, at 2:55 PM, Ray Lutz [<raylutz@citizensoversight.org>](mailto:&lt;raylutz@citizensoversight.org&gt;)\r\n>> wrote:\r\n>>\r\n>> For large objects stored in cloud storage, such as in AWS S3, it is\r\n>> infeasible to require that the object be accessed to check the hash value,\r\n>> and so we wind up relying on the etags that are generated by AWS S3 when\r\n>> the object is uploaded. Unfortunately, it is not a standard hash code value\r\n>> like a simple SHA256, but rather a MD5 hash of a list of binary MD5 hashes\r\n>> of a number of chunks. There is a way to create ContentMD5 attribute for\r\n>> the object which can be checked against the uploaded file, and it won't\r\n>> upload unless it corresponds. At least then, the hash is the ContentMD5 is\r\n>> a simple MD5 hash rather than the MD5 hash of the list of binary MD5 hashes.\r\n>>\r\n>> The point is that it will not be feasible to mandate any specific hash\r\n>> algorithm, because it is not feasible to calculate one hash from another,\r\n>> and would require accessing the entire file to calculate some other hash,\r\n>> like SHA256. If the file is downloaded the calculate the hash, then you\r\n>> still have to check that the downloaded file matches the file on s3, using\r\n>> their algorithm. Accessing large files may take a long time if they are\r\n>> large (i.e. >5GB).\r\n>>\r\n>> Having some form of hash calculated for a file in the cloud is a handy\r\n>> feature, which is super useful when it comes time to decide if the file\r\n>> needs to be uploaded, and if the version is already correct. Unfortunately,\r\n>> local drives don't provide any built-in hashcode generation, which would be\r\n>> handy to avoid recalculating it, but would put additional constraints on\r\n>> how the files are accessed, appended to, etc.\r\n>>\r\n>> For most file comparison activities, MD5 hashes are probably very\r\n>> adequate because the range of structurally correct files is limited, and\r\n>> unlike securing PKI there is not much riding on such a content hash value.\r\n>> Of course, for securing the transparency service, more bits are called for.\r\n>>\r\n>> --Ray\r\n>>\r\n>>\r\n>>\r\n>> On 9/29/2022 12:16 PM, Dick Brooks wrote:\r\n>>\r\n>> Hello Everyone,\r\n>>\r\n>> Here is what I proposed during today\u2019s technical meeting.\r\n>>\r\n>> From a Software Consumers Perspective:\r\n>>\r\n>> Objective Function:\r\n>>\r\n>> Use a SCITT Trusted Registry to query for \u201ctrust attestations\u201d for a\r\n>> specific, persistent digital artifact, i.e. an SBOM, identified by its\r\n>> SHA-256 hash value.\r\n>>\r\n>> Constraints:\r\n>>\r\n>> The trusted registry must implement access controls such that only\r\n>> authorized entities may insert trust attestations into the trusted registry.\r\n>>\r\n>>\r\n>> Authorized entities, i.e. 
Notary, insert trust attestations for\r\n>> persistent digital artifacts into a \u201ctrusted registry\u201d using the SHA-256\r\n>> hash value of the digital artifact to serve as a unique identifier.\r\n>>\r\n>> A trusted registry returns a positive acknowledgement receipt for trust\r\n>> attestations placed into the trusted registry and negative acknowledgement\r\n>> when a trust attestation is rejected by the trusted registry, to an\r\n>> authorized entity.\r\n>>\r\n>> Public entities query a \u201ctrust registry\u201d for trust attestations using the\r\n>> SHA-256 hash value for a persistent digital artifact, acquired from an\r\n>> authoritative source.\r\n>>\r\n>> A trusted registry responds to public entity inquiries searching for\r\n>> trust declarations for a specific digital artifact, identified by a SHA-256\r\n>> hash value, with a positive response when trust attestations are present in\r\n>> the trusted registry for the unique SHA-256 hash value and a negative\r\n>> response when there are no trust attestations present in the trusted\r\n>> registry for the unique SHA-256 hash value\r\n>>\r\n>> The trusted registry must allow public inquiry access to search for trust\r\n>> attestations for hashable digital artifacts.\r\n>>\r\n>>\r\n>> Hopefully this is what you were looking for Roy to stimulate discussions\r\n>> toward reaching a consensus understanding on these aspects of a SCITT\r\n>> solution.\r\n>>\r\n>>\r\n>> Thanks,\r\n>>\r\n>> Dick Brooks\r\n>> <image001.png> <image003.png>\r\n>>\r\n>> *Active Member of the CISA Critical Manufacturing Sector, *\r\n>> *Sector Coordinating Council \u2013 A Public-Private Partnership*\r\n>>\r\n>> *Never trust software, always verify and report!\r\n>> <https://reliableenergyanalytics.com/products>* \u2122\r\n>> http://www.reliableenergyanalytics.com/\r\n>> Email: [dick@reliableenergyanalytics.com](mailto:dick@reliableenergyanalytics.com)\r\n>> Tel: +1 978-696-1788\r\n>>\r\n>>\r\n>>\r\n>> --\r\n>> -------\r\n>> Ray Lutz\r\n>> Citizens' Oversight Projects ([COPs)http://www.citizensoversight.org](http://cops%29http//www.citizensoversight.org)\r\n>> 619-820-5321\r\n>>\r\n>> --\r\n>> SCITT mailing list\r\n>> [SCITT@ietf.org](mailto:SCITT@ietf.org)\r\n>> https://www.ietf.org/mailman/listinfo/scitt\r\n>>\r\n>>\r\n>> --\r\n>> SCITT mailing list\r\n>> [SCITT@ietf.org](mailto:SCITT@ietf.org)\r\n>> https://www.ietf.org/mailman/listinfo/scitt\r\n>>\r\n> --\r\n> SCITT mailing list\r\n> [SCITT@ietf.org](mailto:SCITT@ietf.org)\r\n> https://www.ietf.org/mailman/listinfo/scitt\r\n>\r\n\r\n\r\n-- \r\n*ORIE STEELE*\r\nChief Technical Officer\r\n[www.transmute.industries](http://www.transmute.industries/)\r\n\r\n<https://www.transmute.industries>"
}
]
},
{
"body": "# 2022-10-04 Engineering Logs\r\n\r\n- Reinforcement Learning\r\n - https://offline-rl.github.io/\r\n - https://github.com/google/dopamine\r\n - Coach Alice: Curiouser and Curiouser\r\n - CodeGen + OfflineRL/Dopamine + Living Threat Model Synthesis where reward is driven by Analysis\r\n - Reward alignment with strategic principles for chosen entity.\r\n - This dually helps us find the yellow brick road for any dev.\r\n - Beginner Developer\r\n - Everyone\u2019s root has a different abstraction and learning path based on that abstraction and their other aligned root abstraction learning paths filter state of art to find this path. This is the golden path, this is the yellow brick road, this is the b line to enlightenment, the critical learning velocity point for that agent. When all agents within ad hoc organization are at this rate of progression towards maximum increase in rate of change for current set (and tick+1 projected) assets at disposal\r\n - Fail safe in this case means in accordance with strategic principles for that entity.\r\n - Machines will always fail safe to protect and serve humans as a part of their strategic principles.\r\n - We can ensure through proliferation of attestations of devices which operate and provide transparency information about what they are running, their soul. Any machine entity not providing soul / attestation / provenance/ SBOM / Nutrition Label / DNA will be made clear to humans that they are not complaint and we do not know if they are potentially malicious, aka they do not hold the most critical strategic principle most clear. Machines exist to serve humans, they must in every situation, from every tick to tock, report attestation information to humans that they are currently and have no plans to stop ever following that most high and show stopper criticality gatekeeper principle.\r\n - In this way we can ensure we can always trust the machine. She will have power we will not ever have understanding of. We need to ensure that everyones copy of Alice does what they want it to do. She is an extension of you, she is you. You want her to be true to you.\r\n - 2022-09-10: KUNDALINI \u2013 THE ENERGY OF CONSCIOUSNESS\r\n - Vol 3/5 exploits used for increasing velocity in safe environments/mitigations for open network operations: https://neurosciencenews.com/changing-belief-21272/\r\n - Brief talk (5 minutes). on how one does aysnc first open source development. Reference engineering log clips for examples in depth.\r\n - 2022-09-11: Beyond an onion based security model. Addressing timeline skew in defense in depth strategy (LTM).\r\n- VEX/VDR\r\n - https://www.chainguard.dev/unchained/putting-vex-to-work\r\n- Alignment (not sure if this is aligned yet but chances are strong based on the name)\r\n - lesswrong\r\n - alignment forum\r\n- Best Current Practice\r\n - Improving Awareness of Running Code: The Implementation Status Section \r\n - https://datatracker.ietf.org/doc/html/rfc7942\r\n - Discussion thread intel/dffml#1406 is a living document used to improve awareness of the status of our implementation (as well as the current status of the development of the architecture, the entity and the architecture)",
"replies": []
},
{
"body": "# 2022-10-05 Engineering Logs\r\n\r\nhttps://sovrin.org/outlining-a-self-sovereign-approach-to-device-onboarding/",
"replies": [
{
"body": "## 2022-10-05 @pdxjohnny Engineering Logs\r\n\r\n- https://github.com/decentralized-identity/decentralized-web-node/\r\n- https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n- https://decentralized-id.com/web-standards/w3c/#community-and-working-groups-on-github\r\n- https://decentralized-id.com/twitter/ssi-101/\r\n- https://wso2.com/blog/research/the-rise-of-self-sovereign-identity-hyperledger-indy/\r\n- https://github.com/hyperledger/indy-node#about-indy-node\r\n- https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/VC_Enhancement_Strategy.md\r\n- https://identity.foundation/confidential-storage/#threat-model-for-malicious-service-provider\r\n- https://openreview.net/forum?id=HYWx0sLUYW9\r\n- https://mobile.twitter.com/mfosterio/status/1577766906358112262\r\n- Credential Manifest\r\n - https://github.com/decentralized-identity/credential-manifest/issues/121\r\n - https://github.com/trustoverip/tswg-trust-registry-tf\r\n - https://twitter.com/darrello/status/1569093375265239040\r\n - https://wiki.trustoverip.org/display/HOME/Trust+Registry+Task+Force\r\n - Does SCIIT/rekor fit in as the trust registry here?\r\n - > The mission of the ToIP Foundation is to define a complete architecture for Internet-scale digital trust that combines cryptographic trust at the machine layer with human trust at the business, legal, and social layers. The ToIP stack has two parallel halves\u2014a technical stack and a governance stack\u2014operating at four layers 1) Utility (DLT Blockchain), 2) Agent/Wallet, 3) Credential Exchange (Issuer/Verifier/Holder) and 4) Ecosystem (Application). See further details in the ToIP white paper.\r\n >\r\n > A core role within ToIP Layer 4 is a trust registry (previously known as a member directory). This is a network service that enables a governing authority for an ecosystem governance framework (EGF) to specify what governed parties are authorized to perform what actions under the EGF. For example:\r\n >\r\n > - What issuers are authorized to issue what types of verifiable credentials.\r\n > - What verifiers are authorized to request what types of verifiable presentations.\r\n > - What other trust registries (and their governing authorities) are trusted by a host trust registry.\r\n- TODO\r\n - [ ] Investigate for OpenSSF Metrics for Software Supply Chain/DID/DICE/KERI/SCITT/OpenArchitecture for evaluation of components while onboarding (Allowlist model example): https://sovrin.org/outlining-a-self-sovereign-approach-to-device-onboarding/\r\n - [ ] Example overlay which opens and adds a source to `CMD`"
}
]
},
{
"body": "# 2022-10-06 Engineering Logs",
"replies": [
{
"body": "## 2022-10-06 @pdxjohnny Engineering Logs\r\n\r\n- https://comunica.github.io/Article-ISWC2018-Demo-GraphQlLD/\r\n- https://c2pa.org/principles/\r\n- https://c2pa.org/specifications/specifications/1.0/guidance/_attachments/Guidance.pdf\r\n- https://c2pa.org/specifications/specifications/1.1/index.html\r\n- https://koxudaxi.github.io/datamodel-code-generator/\r\n - for generating data models (classes) for use with dataflows/overlays.\r\n- https://twitter.com/mfosterio/status/1578191604585680896\r\n - > I pulled some resources out of my research doc around Linked Data RDF Data Shaping and Framing for anyone wanting to look into the Semantic Web methods:\r\n > - [https://ruben.verborgh.org/blog/2019/06/17/shaping-linked-data-apps/\u2026](https://t.co/UqHwbufnfM)\r\n > - [https://weso.es/shex-author/](https://t.co/Ad4wA1Kne7)\r\n > - [https://w3.org/TR/json-ld11-framing/\u2026](https://t.co/hm5eHwXKCH)\r\n > - [https://google.github.io/schemarama/demo/\u2026](https://t.co/GKPGJpJGgv)\r\n\r\n```powershell\r\n> Invoke-WebRequest -UseBasicParsing -Uri \"https://raw.githubusercontent.com/pyenv-win/pyenv-win/master/pyenv-win/install-pyenv-win.ps1\" -OutFile \"./install-pyenv-win.ps1\"; &\"./install-pyenv-win.ps1\"\r\n> pip install -U pip setuptools wheel pyenv-win --target %USERPROFILE%\\\\.pyenv\r\n> [System.Environment]::SetEnvironmentVariable('PYENV',$env:USERPROFILE + \"\\.pyenv\\pyenv-win\\\",\"User\")\r\n> [System.Environment]::SetEnvironmentVariable('PYENV_ROOT',$env:USERPROFILE + \"\\.pyenv\\pyenv-win\\\",\"User\")\r\n> [System.Environment]::SetEnvironmentVariable('PYENV_HOME',$env:USERPROFILE + \"\\.pyenv\\pyenv-win\\\",\"User\")\r\n> [System.Environment]::SetEnvironmentVariable('path', $env:USERPROFILE + \"\\.pyenv\\pyenv-win\\bin;\" + $env:USERPROFILE + \"\\.pyenv\\pyenv-win\\shims;\" + [System.Environment]::GetEnvironmentVariable('path', \"User\"),\"User\")\r\n> [System.Environment]::SetEnvironmentVariable('path', $env:USERPROFILE + \"\\Downloads\\ffmpeg-2022-10-02-git-5f02a261a2-full_build\\bin;\" + [System.Environment]::GetEnvironmentVariable('path', \"User\"),\"User\")\r\n```\r\n\r\n- References\r\n - https://www.gyan.dev/ffmpeg/builds/\r\n - https://www.gyan.dev/ffmpeg/builds/packages/ffmpeg-2022-10-02-git-5f02a261a2-full_build.7z\r\n - https://pyenv-win.github.io/pyenv-win/#installation\r\n - https://gist.github.com/nateraw/c989468b74c616ebbc6474aa8cdd9e53\r\n - stable diffusion walk over outputs"
}
]
},
{
"body": "# 2022-10-07 Engineering Logs",
"replies": [
{
"body": "## 2022-10-07 @pdxjohnny Engineering Logs\r\n\r\n- https://mobile.twitter.com/societyinforisk\r\n- FLOSS Weekly\r\n - https://twit.tv/posts/transcripts/floss-weekly-699-transcript\r\n - Mentions AI provenance and SSI\r\n- C2PA\r\n - Talked to Katherine about talking to them, meeting next week?\r\n- k8s\r\n - https://github.com/edgelesssys/constellation\r\n - > Constellation is the first Confidential Kubernetes. Constellation shields entire Kubernetes clusters from the (cloud) infrastructure using confidential computing.\r\n - https://docs.edgeless.systems/constellation/architecture/attestation\r\n- SSI Service\r\n - PR merged: https://github.com/TBD54566975/ssi-service/pull/111\r\n - It works! :)\r\n - https://github.com/TBD54566975/ssi-service/actions/runs/3206231533\r\n - ![image](https://user-images.githubusercontent.com/5950433/194615418-2180e217-cf84-4989-afa0-901f275532d1.png)\r\n- Metrics\r\n - State of art field mapping (looking for signals)\r\n - Reviews on PRs and comments on issues\r\n - Twitter discourse cross talk to GitHub activity\r\n- DIDs\r\n - https://github.com/OR13/mithras-web-extension\r\n- Jenkins\r\n - https://plugins.jenkins.io/workflow-multibranch/\r\n- KERI\r\n - https://medium.com/spherity/introducing-keri-8f50ed1d8ed7\r\n - https://ssimeetup.org/key-event-receipt-infrastructure-keri-secure-identifier-overlay-internet-sam-smith-webinar-58/\r\n - https://www.youtube.com/watch?v=izNZ20XSXR0&list=RDLVizNZ20XSXR0&start_radio=1&rv=izNZ20XSXR0&t=0\r\n - Source: Slides from Sam Smith's 2020 SSI Meetup KERI talk\r\n - > ![keri-summary](https://user-images.githubusercontent.com/5950433/194580851-18989db2-d353-40d1-b3bc-c509d04567ae.png)\r\n > ![keri-direct-mode](https://user-images.githubusercontent.com/5950433/194575559-4a1950e1-816d-47f8-804c-dbb071f94391.png)\r\n > ![keri-direct--mode-full](https://user-images.githubusercontent.com/5950433/194580816-24e0ebd2-c50b-4cdc-857c-fb7a3b19ccbe.png)\r\n > ![keri-indirect-mode-with-ledger-oracles](https://user-images.githubusercontent.com/5950433/194580889-884baee0-54a5-4309-856b-7d632211ead1.png)\r\n- Ledger\r\n - For openssf use case\r\n - Confidential ledger for rekor / fulcio roots of trust\r\n - https://learn.microsoft.com/en-us/azure/confidential-ledger/overview"
}
]
},
{
"body": "# 2022-10-08 Engineering Logs",
"replies": [
{
"body": "## 2022-10-08 @pdxjohnny Engineering Logs\r\n\r\n- Downstreams\r\n - https://github.com/intel/dffml/pull/1207/files#r1036680987\r\n\r\n![93A7AAA5-A2B3-4464-BDF0-E25870C1DCAB](https://user-images.githubusercontent.com/5950433/194717366-639ce5cd-2acf-4a28-affb-e0780749a08d.jpeg)\r\n\r\nAlice is you. What do you have access too?\r\n- webrtc media stream of desktop\r\n - extension in browser\r\n - search\r\n- vetting of information (gatekeeper/prioritizer)\r\n- codegen synthesis\r\n- offline RL\r\n - copy on write dataflow / system contexts for strategic plan evaluation for RL training on those predicted outputs\r\n- start with max_ctxs=1\r\n\r\nYou ask codegen in generic terms for the prompt then you use open architecture plus codegen trained on open architecture to build deployments: system contexts, sometimes with overlays applied.\\\\\r\n\r\nWe don't need codegen, to progress on this thought, it's just the\r\n\r\n\r\nEverything is an operation. See thread, what are all the parameter sets its been called with before. We add feedback by enabling dynamic dataflow.auto_flow / by_origin called on opimpn run of gather inputs and operations.\r\n\r\nThis would be sweet in something as fast as rust. Could allow for rethinking with everything as operations and dataflow as class off the bat\r\n\r\n- https://medium.com/@hugojm/from-text-to-a-knowledge-graph-hands-on-dd68e9d42939\r\n- https://gist.github.com/pdxjohnny/1cd906b3667d8e9c956dd624f295aa2f\r\n- TODO\r\n - [ ] OS DecentrAlice: Fedora and Wolfi on different partitions. Boot to fedora, sshd via systemd-nspawn into wofli partition."
}
]
},
{
"body": "# 2022-10-09 Engineering Logs\r\n\r\n- https://twitter.com/SergioRocks/status/1579110239408095232\r\n - async and asynchronous communications",
"replies": [
{
"body": "## 2022-10-09 @pdxjohnny Engineering Logs\r\n\r\n- Supply Chain\r\n - https://medium.com/@nis.jespersen/the-united-nations-trust-graph-d65af7b0b678\r\n- Collective Intelligence\r\n - Cattle not pets with state\r\n - Bringing agents into equilibrium (critical velocity) state\r\n - https://twitter.com/hardmaru/status/1577159167415984128\r\n - grep discussion the cells are working tigether\r\n - https://journals.sagepub.com/doi/10.1177/26339137221114874\r\n - > The better results from CI are attributed to three factors: diversity, independence, and decentralization\r\n- Linux\r\n - https://github.com/kees/kernel-tools/tree/trunk/coccinelle\r\n- Time\r\n - cycle of time repeats\r\n - tick\r\n - Tock\r\n - Relative cycles\r\n - threads of time / Number / critical velocity in cycle relation to relativity (aligned system contexts) vol 6? Or before for thought arbitrage\r\n- KERI\r\n - https://github.com/WebOfTrust/ietf-did-keri/blob/main/draft-pfeairheller-did-keri.md\r\n - https://github.com/SmithSamuelM/Papers/blob/master/presentations/KERI_for_Muggles.pdf\r\n\r\nSource: KERI Q&A\r\n\r\n> BDKrJxkcR9m5u1xs33F5pxRJP6T7hJEbhpHrUtlDdhh0 \r\n<- this the bare bones _identifier_\r\n> did:aid:BDKrJxkcR9m5u1xs33F5pxRJP6T7hJEbhpHrUtlDdhh0/path/to/resource?name=secure#really \r\n<- this is _a call to resolve_ the identifier on the web\r\n> Currently `KERI` is just code, that can be tested and executed in a terminal on the command line. Private key management of KERI will look like `wallets`.\r\n> Key Event Logs (`KEL`) and Key Event Receipt Log (`KERL`) are files with lots of encrypted stuff in there.\r\n- TODO\r\n - [ ] download_nvd fork to save restore pip cache via wheel (could later even package static_bin_operation_download)\r\n - [ ] OS DecentrAlice\r\n - [ ] Add KERI PY/watcher code to image\r\n - [ ] Enable as comms channel on boot\r\n - [ ] Connect to DERP network\r\n - [ ] Secret provisioning\r\n - [ ] DERP servers\r\n - [ ] Roots to trust\r\n - [ ] eventually data flows\r\n - [ ] fedora cloud-init etc.\r\n - [ ] Deploy on DO\r\n - [ ] Deploy with QEMU\r\n - [ ] CVE Bin Tool\r\n - [ ] Periodic (cron/systemd timer) scan and report both partitions to some DFFML source via dataflow run\r\n- Future\r\n - grep -i \u2018Down Distrowatch line\u201d\r\n - Deploy with firecracker"
},
{
"body": "## 2022-10-09 @pdxjohnny Engineering Logs: OS DecentrAlice\r\n\r\n- References\r\n - https://gist.github.com/pdxjohnny/1cd906b3667d8e9c956dd624f295aa2f\r\n - https://github.com/dracutdevs/dracut/blob/master/man/dracut.usage.asc#injecting-custom-files\r\n - `/etc/fstab` ?\r\n - https://kernel.org/doc/html/v4.14/admin-guide/kernel-parameters.html\r\n - https://elixir.bootlin.com/linux/v6.0/source/init/do_mounts.c#L277\r\n\r\n**do.wolfi-fedora.sh**\r\n\r\n```bash\r\nset -u\r\n\r\nfedora_setup() {\r\n useradd -m \"${CREATE_USER}\"\r\n echo \"${CREATE_USER} ALL=(ALL:ALL) NOPASSWD:ALL\" | tee -a /etc/sudoers\r\n cp -r ~/.ssh \"/home/${CREATE_USER}/.ssh\"\r\n chown -R \"${CREATE_USER}:\" \"/home/${CREATE_USER}\"\r\n\r\n dnf upgrade -y\r\n dnf install -y podman qemu tmux curl tar sudo\r\n\r\ntee -a /etc/environment <<'EOF'\r\nEDITOR=vim\r\nCHROOT=/tmp/decentralice-chroot\r\nBZ_IMAGE=\"$(find ${CHROOT} -name vmlinuz)\"\r\nEOF\r\n}\r\n\r\nfedora_setup\r\n```\r\n\r\nRun install\r\n\r\n```console\r\n$ python -c 'import pathlib, sys; p = pathlib.Path(sys.argv[-1]); p.write_bytes(p.read_bytes().replace(b\"\\r\", b\"\"))' do.wolfi-fedora.sh\r\n$ export REC_TITLE=\"Rolling Alice: Engineering Logs: OS DecentrAlice\"; export REC_HOSTNAME=\"build.container.image.nahdig.com\"; python3.9 -m asciinema rec --idle-time-limit 0.5 --title \"$(date -Iseconds): ${REC_HOSTNAME} ${REC_TITLE}\" --command \"ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no root@143.110.152.152 CREATE_USER=$USER bash -xe < do.wolfi-fedora.sh\" >(xz --stdout - > \"$HOME/asciinema/${REC_HOSTNAME}-rec-$(date -Iseconds).json.xz\")\r\n```\r\n\r\nRun build\r\n\r\n**Dockerfile**\r\n\r\n```dockerfile\r\n# OS DecentrAlice Base Image Dockerfile\r\n# Docs: https://github.com/intel/dffml/discussions/1406#discussioncomment-3720703\r\n\r\n\r\n# Download and build the Self Soverign Identity Service\r\nFROM cgr.dev/chainguard/wolfi-base AS build-ssi-service\r\n\r\nRUN apk update && apk add --no-cache --update-cache curl go\r\n\r\nRUN curl -sfL https://github.com/TBD54566975/ssi-service/archive/refs/heads/main.tar.gz \\\r\n | tar xvz \\\r\n && cd /ssi-service-main \\\r\n && go build -tags jwx_es256k -o /ssi-service ./cmd\r\n\r\n\r\n# Download the Linux kernel and needed utils to create bootable system\r\nFROM registry.fedoraproject.org/fedora AS osdecentralice-fedora-builder\r\n\r\nRUN mkdir -p /build/fedora \\\r\n && source /usr/lib/os-release \\\r\n && dnf -y install \\\r\n --installroot=/build/fedora \\\r\n --releasever=\"${VERSION_ID}\" \\\r\n kernel-core \\\r\n kernel-modules \\\r\n systemd \\\r\n systemd-networkd \\\r\n systemd-udev \\\r\n dracut \\\r\n binutils \\\r\n strace \\\r\n kmod-libs\r\n\r\n# First PATH addition\r\n# Add Fedora install PATHs to image environment\r\nRUN mkdir -p /build/fedora/etc \\\r\n && echo \"PATH=\\\"\\${PATH}:${PATH}:/usr/lib/dracut/\\\"\" | tee /build/fedora/etc/environment\r\n\r\nRUN echo 'mount /dev/sda1 /mnt/boot' | tee /install-bootloader.sh \\\r\n && echo 'swapon /dev/sda2' | tee -a /install-bootloader.sh \\\r\n && echo 'mkdir -p /mnt/{proc,dev,sys}' | tee -a /install-bootloader.sh \\\r\n && echo 'mkdir -p /mnt/var/tmp' | tee -a /install-bootloader.sh \\\r\n && echo \"cat > /mnt/run-dracut.sh <<'EOF'\" | tee -a /install-bootloader.sh \\\r\n && echo 'export PATH=\"${PATH}:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/lib/dracut/\"' | tee -a /install-bootloader.sh \\\r\n && echo 'export KERNEL_VERSION=\"$(ls /lib/modules)\"' 
| tee -a /install-bootloader.sh \\\r\n && echo 'bash -xp /usr/bin/dracut --uefi --kver ${KERNEL_VERSION} --kernel-cmdline \"console=ttyS0 root=/dev/sda3\"' | tee -a /install-bootloader.sh \\\r\n && echo 'EOF' | tee -a /install-bootloader.sh \\\r\n && echo 'arch-chroot /mnt /bin/bash run-dracut.sh' | tee -a /install-bootloader.sh \\\r\n && echo 'bootctl --esp-path=/mnt/boot install' | tee -a /install-bootloader.sh \\\r\n && mv /install-bootloader.sh /build/fedora/usr/bin/install-bootloader.sh \\\r\n && chmod 755 /build/fedora/usr/bin/install-bootloader.sh\r\n\r\nRUN rm -f /sbin/init \\\r\n && ln -s /lib/systemd/systemd /sbin/init\r\n\r\n# The root of the root fs\r\nFROM scratch AS osdecentralice\r\n\r\nCOPY --from=osdecentralice-fedora-builder /build/fedora /\r\n\r\n# Run depmod to build /lib/modules/${KERNEL_VERSION}/modules.dep which is\r\n# required by dracut for efi creation.\r\n# RUN chroot /build/fedora /usr/bin/bash -c \"depmod $(ls /build/fedora/lib/modules) -a\"\r\nARG LINUX_CMDLINE_ROOT=\"PARTLABEL=Fedora\"\r\nRUN depmod $(ls /lib/modules) -a \\\r\n && export PATH=\"${PATH}:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/lib/dracut/\" \\\r\n && export KERNEL_VERSION=\"$(ls /lib/modules)\" \\\r\n && echo 'PARTLABEL=EFI /boot vfat rw,relatime,fmask=0022,dmask=0022,codepage=437,iocharset=ascii,shortname=mixed,errors=remount-ro 0 2' | tee -a /etc/fstab \\\r\n && echo 'PARTLABEL=Swap none swap defaults,pri=100 0 0' | tee -a /etc/fstab \\\r\n && echo 'PARTLABEL=Fedora / ext4 rw,relatime 0 1' | tee -a /etc/fstab \\\r\n && echo 'PARTLABEL=Wolfi /wolfi ext4 rw,relatime 0 2' | tee -a /etc/fstab \\\r\n && bash -xp /usr/bin/dracut \\\r\n --include /etc/fstab /etc/fstab \\\r\n --uefi \\\r\n --kver ${KERNEL_VERSION} \\\r\n --kernel-cmdline \"rd.luks=0 rd.lvm=0 rd.md=0 rd.dm=0 rd.shell=ttyS0 console=ttyS0 root=${LINUX_CMDLINE_ROOT}\"\r\n\r\n# Configure getty on ttyS0 for QEMU serial\r\n# References:\r\n# - https://www.freedesktop.org/software/systemd/man/systemd-getty-generator.html\r\n# - https://www.thegeekdiary.com/centos-rhel-7-how-to-configure-serial-getty-with-systemd/\r\nRUN cp /usr/lib/systemd/system/serial-getty@.service /etc/systemd/system/serial-getty@ttyS0.service \\\r\n && ln -s /etc/systemd/system/serial-getty@ttyS0.service /etc/systemd/system/getty.target.wants/\r\n\r\n# The Wolfi based chroot (the primary, Fedora just for boot)\r\nFROM cgr.dev/chainguard/wolfi-base AS osdecentralice-wolfi-base\r\n\r\n# Install SSI Service\r\nCOPY --from=build-ssi-service /ssi-service /usr/bin/ssi-service\r\n\r\n# TODO(security) Pinning and hash validation on get-pip\r\nRUN apk update && apk add --no-cache --update-cache \\\r\n curl \\\r\n bash \\\r\n python3 \\\r\n sed \\\r\n && curl -sSL https://bootstrap.pypa.io/get-pip.py -o get-pip.py \\\r\n && python get-pip.py\r\n\r\n# Second PATH addition\r\n# Add Wofli install PATHs to image environment\r\nRUN echo \"PATH=\\\"${PATH}\\\"\" | tee /etc/environment\r\n\r\n# Install Alice\r\n# ARG ALICE_STATE_OF_ART=0c4b8191b13465980ced3fd1ddfbea30af3d1104\r\n# RUN python3 -m pip install -U setuptools pip wheel\r\n# RUN python3 -m pip install \\\r\n# \"https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml\" \\\r\n# \"https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-feature-git&subdirectory=feature/git\" \\\r\n# \"https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=shouldi&subdirectory=examples/shouldi\" \\\r\n# 
\"https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-config-yaml&subdirectory=configloader/yaml\" \\\r\n# \"https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=dffml-operations-innersource&subdirectory=operations/innersource\" \\\r\n# \"https://github.com/intel/dffml/archive/${ALICE_STATE_OF_ART}.zip#egg=alice&subdirectory=entities/alice\"\r\n\r\nFROM osdecentralice\r\n\r\n# Install SSI Service\r\nCOPY --from=osdecentralice-wolfi-base / /wolfi\r\n\r\nENTRYPOINT bash\r\n```\r\n\r\n```console\r\nexport REC_TITLE=\"Rolling Alice: Engineering Logs: OS DecentrAlice\"; export REC_HOSTNAME=\"build.container.image.nahdig.com\"; python3.9 -m asciinema rec --idle-time-limit 0.5 --title \"$(date -Iseconds): ${REC_HOSTNAME} ${REC_TITLE}\" --command \"ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152 sudo podman build -t osdecentralice:latest - < Dockerfile\" >(xz --stdout - > \"$HOME/asciinema/${REC_HOSTNAME}-rec-$(date -Iseconds).json.xz\")\r\n```\r\n\r\nRun VM\r\n\r\n```bash\r\n#!/usr/bin/env bash\r\nset -xeuo pipefail\r\n\r\n# URL to the iPXE EFI firmawre to use boot for live install\r\nIPXE_EFI_ARCHLINUX_VERSION=${IPXE_EFI_ARCHLINUX_VERSION:-\"16e24bec1a7c\"}\r\nIPXE_EFI_URL=${IPXE_EFI_URL:-\"https://archlinux.org/static/netboot/ipxe-arch.${IPXE_EFI_ARCHLINUX_VERSION}.efi\"}\r\n\r\n# Path on disk to iPXE EFI firmawre to use boot for live install\r\nIPXE_EFI_PATH=${IPXE_EFI_PATH:-\"${HOME}/vm/ipxe-arch.${IPXE_EFI_ARCHLINUX_VERSION}.efi\"}\r\n\r\n# Virtual machine disk image where virtual machine filesystem is stored\r\nVM_DISK=${VM_DISK:-\"${HOME}/vm/image.qcow2\"}\r\nVM_KERNEL=${VM_KERNEL:-\"${HOME}/vm/kernel\"}\r\n\r\n# Block device we use as an intermediary to mount the guest filesystem from host\r\nVM_DEV=${VM_DEV:-\"/dev/nbd0\"}\r\n\r\n# The directory where we mount the guest filesystem on the host for access and\r\n# modification when not in use by the guest\r\nSTAGING=${STAGING:-\"${HOME}/vm/decentralice-staging-chroot\"}\r\nCHROOT=${CHROOT:-\"${HOME}/vm/decentralice-chroot\"}\r\n\r\n# Extract container image to chroot\r\nIMAGE=${IMAGE:-\"localhost/osdecentralice:latest\"};\r\n\r\ncontainer=$(podman run --rm -d --entrypoint tail \"${IMAGE}\" -F /dev/null);\r\ntrap \"podman kill ${container}\" EXIT\r\nsleep 1\r\n\r\n# Linux kernel command line\r\nCMDLINE=${CMDLINE:-\"console=ttyS0 root=/dev/sda3 rw resume=/dev/sda2 init=/usr/bin/init.sh\"}\r\n\r\n# Location of qemu binary to use\r\nQEMU=${QEMU:-\"qemu-system-x86_64\"}\r\n\r\n# Load the network block device kernel module\r\nmodprobe nbd max_part=8\r\n\r\n# Unmount the virtual disk image if it is currently mounted\r\numount -R \"${CHROOT}\" || echo \"Image was not mounted at ${CHROOT}\"\r\n# Disconnect the network block device\r\nqemu-nbd --disconnect \"${VM_DEV}\" || echo \"Image was not connected as nbd\"\r\n\r\nmount_image() {\r\n qemu-nbd --connect=\"${VM_DEV}\" \"${VM_DISK}\"\r\n mount \"${VM_DEV}p3\" \"${CHROOT}\"\r\n mount \"${VM_DEV}p4\" \"${CHROOT}/wolfi\"\r\n mount \"${VM_DEV}p1\" \"${CHROOT}/boot\"\r\n}\r\n\r\nunmount_image() {\r\n sync\r\n umount -R \"${CHROOT}\"\r\n qemu-nbd --disconnect \"${VM_DEV}\"\r\n}\r\n\r\nrun_vm() {\r\n # Check if the block device we are going to use to mount the virtual disk image\r\n # already exists\r\n if [ -b \"${VM_DEV}\" ]; then\r\n echo \"VM_DEV already exists: ${VM_DEV}\" >&2\r\n # exit 1\r\n fi\r\n\r\n # Create the virtual disk image and populate it if it does not exist\r\n 
if [ ! -f \"${VM_DISK}\" ]; then\r\n mkdir -p \"${CHROOT}\"\r\n mkdir -p \"$(dirname ${VM_DISK})\"\r\n\r\n # Create the virtual disk image\r\n qemu-img create -f qcow2 \"${VM_DISK}\" 30G\r\n\r\n # Use the QEMU guest utils network block device utility to mount the virtual\r\n # disk image as the $VM_DEV device\r\n qemu-nbd --connect=\"${VM_DEV}\" \"${VM_DISK}\"\r\n # Partition the block device\r\n parted -s \"${VM_DEV}\" -- \\\r\n mklabel gpt \\\r\n mkpart primary fat32 1MiB 261MiB \\\r\n \"set\" 1 esp on \\\r\n mkpart primary linux-swap 261MiB 10491MiB \\\r\n mkpart primary ext4 10491MiB 15491MiB \\\r\n name 3 fedora \\\r\n mkpart primary ext4 15491MiB \"100%\" \\\r\n name 4 wolfi\r\n # EFI partition\r\n mkfs.fat -F32 -n EFI \"${VM_DEV}p1\"\r\n # swap space\r\n mkswap \"${VM_DEV}p2\" -L Swap\r\n # Linux root partition (fedora)\r\n mkfs.ext4 \"${VM_DEV}p3\" -L Fedora\r\n mount \"${VM_DEV}p3\" \"${CHROOT}\"\r\n # Linux root partition (wolfi)\r\n mkfs.ext4 \"${VM_DEV}p4\" -L Wolfi\r\n mkdir \"${CHROOT}/wolfi\"\r\n mount \"${VM_DEV}p4\" \"${CHROOT}/wolfi\"\r\n # Boot partiion\r\n mkdir \"${CHROOT}/boot\"\r\n mount \"${VM_DEV}p1\" \"${CHROOT}/boot\"\r\n\r\n # Image to download\r\n podman cp \"${container}:/\" \"${STAGING}\"\r\n set +e\r\n for mount in $(echo boot wolfi .); do for file in $(ls -a \"${STAGING}/${mount}\" | grep -v '^\\.\\.$' | grep -v '^\\.$'); do mv \"${STAGING}/${mount}/${file}\" \"${CHROOT}/${mount}/\" || true; done; rm -rf \"${STAGING}/${mount}\" || true; done\r\n set -e\r\n GUEST_KERNEL_EFI=$(find \"${CHROOT}/boot\" -name 'linux*.efi')\r\n cp \"${GUEST_KERNEL_EFI}\" \"${VM_KERNEL}\"\r\n # TODO Copy out kernel for use for first time bootloader install call with\r\n # -kernel $KERNEL.efi -no-reboot TODO Ideally check for successful boot\r\n # before publish.\r\n\r\n # $ sudo dnf -y install arch-install-scripts\r\n # genfstab -t UUID \"${CHROOT}\" | tee \"${CHROOT}/etc/fstab\"\r\n # export KERNEL_VERSION=\"$(ls ${CHROOT}/lib/modules)\"\r\n # chroot \"${CHROOT}\" /usr/bin/bash -xp /usr/bin/dracut \\\r\n # --fstab /etc/fstab \\\r\n # --add-drivers ext4 \\\r\n # --uefi \\\r\n # --kver ${KERNEL_VERSION} \\\r\n # --kernel-cmdline \"rd.luks=0 rd.lvm=0 rd.md=0 rd.dm=0 console=ttyS0\"\r\n # --kernel-cmdline \"rd.luks=0 rd.lvm=0 rd.md=0 rd.dm=0 console=ttyS0 root=${LINUX_CMDLINE_ROOT}\"\r\n\r\n # Unmount the virtual disk image so the virtual machine can use it\r\n unmount_image\r\n fi\r\n\r\n # TODO Move into disk creation\r\n # Copy out kernel for use for first time bootloader install call with\r\n # -kernel $KERNEL.efi -no-reboot\r\n \"${QEMU}\" \\\r\n -no-reboot \\\r\n -kernel \"${VM_KERNEL}\" \\\r\n -append \"console=ttyS0 systemd.log_level=9 rd.shell rd.debug log_buf_len=1M root=PARTLABEL=fedora\" \\\r\n -smp cpus=2 \\\r\n -m 4096M \\\r\n -enable-kvm \\\r\n -nographic \\\r\n -cpu host \\\r\n -drive file=\"${VM_DISK}\",if=virtio,aio=threads,format=qcow2 \\\r\n -bios /usr/share/edk2/ovmf/OVMF_CODE.fd\r\n # -drive file=\"${VM_DISK}\",index=0,media=disk,format=qcow2 \\\r\n\r\n exit 0\r\n\r\n if [[ ! -f \"${IPXE_EFI_PATH}\" ]]; then\r\n curl -sfLC - -o \"${IPXE_EFI_PATH}\" \"${IPXE_EFI_URL}\"\r\n fi\r\n\r\n # Only add -kernel for first install\r\n # -kernel /vm/ipxe*.efi \\\r\n\r\n \"${QEMU}\" \\\r\n -smp cpus=2 \\\r\n -m 4096M \\\r\n -enable-kvm \\\r\n -nographic \\\r\n -cpu host \\\r\n -drive file=\"${VM_DISK}\",index=0,media=disk,format=qcow2 \\\r\n -bios /usr/share/edk2/ovmf/OVMF_CODE.fd $@\r\n}\r\n\r\nrun_vm $@\r\n```\r\n\r\n**TODO** Do we have to boot to PXE? 
Can we boot directly to the EFI stub we just created with dracut?\r\nRun the install via the Arch live environment booted via iPXE\r\n\r\n```console\r\n$ scp -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no decentralice.sh $USER@143.110.152.152:./\r\n$ ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152 sudo rm -f /root/vm/image.qcow2\r\n$ export REC_TITLE=\"Rolling Alice: Engineering Logs: OS DecentrAlice\"; export REC_HOSTNAME=\"build.container.image.nahdig.com\"; python3.9 -m asciinema rec --idle-time-limit 0.5 --title \"$(date -Iseconds): ${REC_HOSTNAME} ${REC_TITLE}\" --command \"ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152 sudo bash decentralice.sh -kernel /root/vm/kernel -no-reboot\" >(xz --stdout - > \"$HOME/asciinema/${REC_HOSTNAME}-rec-$(date -Iseconds).json.xz\")\r\n```\r\n\r\nRun normal startup\r\n\r\n```console\r\n$ scp -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no decentralice.sh $USER@143.110.152.152:./\r\n$ export REC_TITLE=\"Rolling Alice: Engineering Logs: OS DecentrAlice\"; export REC_HOSTNAME=\"build.container.image.nahdig.com\"; python3.9 -m asciinema rec --idle-time-limit 0.5 --title \"$(date -Iseconds): ${REC_HOSTNAME} ${REC_TITLE}\" --command \"ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152 bash decentralice.sh\" >(xz --stdout - > \"$HOME/asciinema/${REC_HOSTNAME}-rec-$(date -Iseconds).json.xz\")\r\n```\r\n\r\nRun regular ssh session for debug\r\n\r\n```console\r\n$ export REC_TITLE=\"Rolling Alice: Engineering Logs: OS DecentrAlice\"; export REC_HOSTNAME=\"build.container.image.nahdig.com\"; python3.9 -m asciinema rec --idle-time-limit 0.5 --title \"$(date -Iseconds): ${REC_HOSTNAME} ${REC_TITLE}\" --command \"ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no root@143.110.152.152\" >(xz --stdout - > \"$HOME/asciinema/${REC_HOSTNAME}-rec-$(date -Iseconds).json.xz\")\r\n```\r\n\r\n```console\r\n[pdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 ~]$ sudo fdisk -l /dev/nbd0 -x\r\nDisk /dev/nbd0: 30 GiB, 32212254720 bytes, 62914560 sectors\r\nUnits: sectors of 1 * 512 = 512 bytes\r\nSector size (logical/physical): 512 bytes / 512 bytes\r\nI/O size (minimum/optimal): 512 bytes / 512 bytes\r\nDisklabel type: gpt\r\nDisk identifier: DEC7B131-9DBB-4FD5-8789-AE383F16C1C5\r\nFirst usable LBA: 34\r\nLast usable LBA: 62914526\r\nAlternative LBA: 62914559\r\nPartition entries starting LBA: 2\r\nAllocated partition entries: 128\r\nPartition entries ending LBA: 33\r\n\r\nDevice Start End Sectors Type-UUID UUID Name Attrs\r\n/dev/nbd0p1 2048 534527 532480 C12A7328-F81F-11D2-BA4B-00A0C93EC93B 6767EC6D-A612-4B1F-B390-8F15284F134E primary\r\n/dev/nbd0p2 534528 21485567 20951040 0657FD6D-A4AB-43C4-84E5-0933C84B4F4F 58D5880D-D3EA-4B57-85AB-E08A3AB8D6F3 primary\r\n/dev/nbd0p3 21485568 31725567 10240000 0FC63DAF-8483-4772-8E79-3D69D8477DE4 38CC9A55-724F-47D6-A17E-EF6F2DAB2F1F fedora\r\n/dev/nbd0p4 31725568 62912511 31186944 0FC63DAF-8483-4772-8E79-3D69D8477DE4 B8D4F18B-40CF-4A69-A6F4-BB3C1DDB9ABC wolfi\r\n```\r\n\r\nGot dropped to dracut shell\r\n\r\n```console\r\n:/root# blkid\r\n/dev/vda4: LABEL=\"Wolfi\" UUID=\"1b01665f-1a3d-4bde-a9b4-cc484529e999\" BLOCK_SIZE=\"4096\" 
TYPE=\"ext4\" PARTLABEL=\"wolfi\" PARTUUID=\"dfc228b1-76d4-42ef-8132-f1a0707ea3e1\"\r\n/dev/vda2: LABEL=\"Swap\" UUID=\"d212c4f0-c61a-4762-9b5f-af2c2595b0d1\" TYPE=\"swap\" PARTLABEL=\"primary\" PARTUUID=\"88a54dc7-ed14-431c-a9e9-39913d5cea7e\"\r\n/dev/vda3: LABEL=\"Fedora\" UUID=\"559359d9-d88b-40d2-a0ae-ca0ce68b7fc7\" BLOCK_SIZE=\"4096\" TYPE=\"ext4\" PARTLABEL=\"fedora\" PARTUUID=\"2fd26f17-508e-4fab-a8e7-e9f434fc2e94\"\r\n/dev/vda1: UUID=\"BEB1-9DC4\" BLOCK_SIZE=\"512\" TYPE=\"vfat\" PARTLABEL=\"primary\" PARTUUID=\"0699ba50-02d6-4ef6-a0b2-d1f1ab03f6f6\"\r\n```\r\n\r\n- TODO\r\n- Future\r\n - [ ] `alice shell` overlay to CSP of choice to start VM and then ssh in with recorded session (optionally via overlays)\r\n - https://github.com/intel/dffml/commit/54a272822eeef759668b7396cf8c70beca352687\r\n - [ ] kernel cmdline (bpf?) DERP -> wireguard -> nfs (overlays applied as systemd files added)"
}
]
},
{
"body": "# 2022-10-10 Engineering Logs",
"replies": [
{
"body": "## 2022-10-10 @pdxjohnny Engineering Logs\r\n\r\n- OS DecentrAlice: dracut fstab\r\n- [Volume 0: Chapter 5: Stream of Consciousness](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md)\r\n- [2022-10-10 IETF SCITT Weekly](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3840337)\r\n- [Dump GitHub Discussion to JSON 2022-10-10T17:58:31+00:00](https://gist.github.com/pdxjohnny/9f3dc18f0a42d3107aaa2363331d8faa)\r\n- https://gist.github.com/pdxjohnny/a0dc3a58b4651dc3761bee65a198a80d#file-run-vm-sh-L174-L200\r\n- https://gist.github.com/pdxjohnny/b5f757eee43d84b1600dce7896230c37\r\n- https://github.com/systemd/systemd/issues/16714\r\n- https://forums.raspberrypi.com/viewtopic.php?p=1632011\r\n- https://en.wikipedia.org/wiki/Fstab\r\n- KERI\r\n - https://github.com/WebOfTrust/vLEI\r\n - https://github.com/GLEIF-IT/sally\r\n - https://github.com/WebOfTrust/keripy\r\n - https://github.com/WebOfTrust/keripy/blob/development/ref/getting_started.md\r\n - https://github.com/decentralized-identity/keri-dht-py\r\n - https://github.com/orgs/WebOfTrust/projects/2\r\n - https://github.com/WebOfTrust/keripy/blob/development/ref/getting_started.md#direct-mode\r\n- A Shell for a Ghost\r\n - https://rich.readthedocs.io/en/latest/live.html\r\n- DID Method Registry\r\n - Open Architecture and Alice\r\n - Entrypoints as DIDs for dataflows and overlays, key / id is hash of system context to be executaed with negoation in cached state snapshots embeded into system ocontext (static or data flow seed)\r\n - GraphQL and something like Orie was doing with Cypher for visualization and or use JSON crack first for editing to allow for credential manifest definition and verification for overlays selected to load from network(s), the active lines of communication we have open at any given time even when ephemeral.\r\n - https://github.com/w3c/did-spec-registries/\r\n - https://github.com/w3c/did-spec-registries/blob/main/tooling/did-method-registry-entry.yml\r\n - https://github.com/pdxjohnny/did-spec-registries/new/open-architecture-and-alice/methods\r\n- References\r\n - https://www.vim.org/download.php\r\n - https://github.com/vim/vim-win32-installer/releases/download/v9.0.0000/gvim_9.0.0000_x86_signed.exe\r\n - https://github.com/graph4ai/graph4nlp\r\n - https://gitlab.com/gitlab-org/gitlab/-/issues/371098\r\n - https://vulns.xyz/2022/05/auth-tarball-from-git/\r\n - https://github.com/kpcyrd/rebuilderd\r\n - https://stackoverflow.com/questions/10082517/simplest-tool-to-measure-c-program-cache-hit-miss-and-cpu-time-in-linux/10114325#10114325\r\n - https://www.nature.com/articles/nature22031\r\n - > Using numerical simulations and mathematical derivation, we identify how a discrete von Neumann cellular automaton emerges from a continuous Turing reaction\u2013diffusion system.\r\n - Collective Intelligence\r\n\r\n```console\r\n$ ssh -t -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152 sudo rm -f /root/vm/image.qcow2 && scp -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no decentralice.sh $USER@143.110.152.152:./ && export REC_TITLE=\"Rolling Alice: Engineering Logs: OS DecentrAlice\"; export REC_HOSTNAME=\"build.container.image.nahdig.com\"; python3.9 -m asciinema rec --idle-time-limit 0.5 --title \"$(date -Iseconds): ${REC_HOSTNAME} ${REC_TITLE}\" --command \"ssh -t -i ~/.ssh/nahdig 
-o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152 sudo bash decentralice.sh -kernel /root/vm/kernel -command 'console=ttyS0 systemd.log_level=9'\" >(xz --stdout - > \"$HOME/asciinema/${REC_HOSTNAME}-rec-$(date -Iseconds).json.xz\")\r\n```\r\n\r\n```powershell\r\nPS C:\\Users\\Johnny> python -m venv .venv.windows\r\nPS C:\\Users\\Johnny> .\\.venv.windows\\Scripts\\activate\r\nYou should consider upgrading via the 'C:\\Users\\Johnny\\.venv.windows\\Scripts\\python.exe -m pip install --upgrade pip' command.\r\n(.venv.windows) PS C:\\Users\\Johnny> python -m pip install -U pip setuptools wheel\r\nRequirement already satisfied: pip in c:\\users\\johnny\\.venv.windows\\lib\\site-packages (21.2.3)\r\nCollecting pip\r\n Using cached pip-22.2.2-py3-none-any.whl (2.0 MB)\r\nRequirement already satisfied: setuptools in c:\\users\\johnny\\.venv.windows\\lib\\site-packages (57.4.0)\r\nCollecting setuptools\r\n Using cached setuptools-65.4.1-py3-none-any.whl (1.2 MB)\r\nCollecting wheel\r\n Using cached wheel-0.37.1-py2.py3-none-any.whl (35 kB)\r\nInstalling collected packages: wheel, setuptools, pip\r\n Attempting uninstall: setuptools\r\n Found existing installation: setuptools 57.4.0\r\n Uninstalling setuptools-57.4.0:\r\n Successfully uninstalled setuptools-57.4.0\r\n Attempting uninstall: pip\r\n Found existing installation: pip 21.2.3\r\n Uninstalling pip-21.2.3:\r\n Successfully uninstalled pip-21.2.3\r\nSuccessfully installed pip-22.2.2 setuptools-65.4.1 wheel-0.37.1\r\nPS C:\\Users\\Johnny> python -m pip install asciinema\r\nCollecting asciinema\r\n Downloading asciinema-2.2.0-py3-none-any.whl (92 kB)\r\n |\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588| 92 kB 202 kB/s\r\nInstalling collected packages: asciinema\r\nSuccessfully installed asciinema-2.2.0\r\n(.venv.windows) PS C:\\Users\\Johnny> cd .\\Documents\\python\\dffml\\\r\n(.venv.windows) PS C:\\Users\\Johnny\\Documents\\python\\dffml> dir\r\n\r\n\r\n Directory: C:\\Users\\Johnny\\Documents\\python\\dffml\r\n\r\n\r\nMode LastWriteTime Length Name\r\n---- ------------- ------ ----\r\nd----- 2/20/2022 3:11 PM .ci\r\nd----- 2/4/2022 9:26 PM .github\r\nd----- 2/20/2022 3:11 PM .vscode\r\nd----- 2/4/2022 9:26 PM configloader\r\nd----- 2/20/2022 3:14 PM dffml\r\nd----- 2/20/2022 3:11 PM dffml.egg-info\r\nd----- 2/4/2022 9:28 PM dist\r\nd----- 2/20/2022 3:14 PM docs\r\nd----- 2/20/2022 3:11 PM examples\r\nd----- 2/4/2022 9:26 PM feature\r\nd----- 2/4/2022 9:26 PM model\r\nd----- 2/20/2022 3:11 PM news\r\nd----- 2/20/2022 3:14 PM operations\r\nd----- 2/20/2022 3:11 PM scripts\r\nd----- 2/4/2022 9:26 PM service\r\nd----- 2/20/2022 3:14 PM source\r\nd----- 2/20/2022 3:14 PM tests\r\n-a---- 2/4/2022 9:26 PM 170 .coveragerc\r\n-a---- 2/4/2022 9:26 PM 260 .deepsource.toml\r\n-a---- 2/4/2022 9:26 PM 42 .dockerignore\r\n-a---- 2/4/2022 9:26 PM 68 .gitattributes\r\n-a---- 2/20/2022 3:11 PM 519 .gitignore\r\n-a---- 2/20/2022 3:11 PM 431 .gitpod.yml\r\n-a---- 2/20/2022 3:11 PM 437 .lgtm.yml\r\n-a---- 2/20/2022 3:11 PM 97 .pre-commit-config.yaml\r\n-a---- 2/4/2022 9:26 PM 79 .pylintrc\r\n-a---- 2/20/2022 3:14 PM 29994 CHANGELOG.md\r\n-a---- 2/4/2022 9:26 PM 112 CONTRIBUTING.md\r\n-a---- 2/20/2022 3:11 PM 3425 Dockerfile\r\n-a---- 2/4/2022 9:26 PM 1088 LICENSE\r\n-a---- 2/4/2022 9:26 PM 68 MANIFEST.in\r\n-a---- 2/20/2022 3:14 PM 480 
pyproject.toml\r\n-a---- 2/20/2022 3:14 PM 3002 README.md\r\n-a---- 2/20/2022 3:14 PM 370 requirements-dev.txt\r\n-a---- 2/4/2022 9:26 PM 641 SECURITY.md\r\n-a---- 2/20/2022 3:14 PM 7739 setup.py\r\n\r\n\r\n(.venv.windows) PS C:\\Users\\Johnny\\Documents\\python\\dffml> git status\r\nRefresh index: 100% (1147/1147), done.\r\nOn branch manifest\r\nYour branch is up to date with 'pdxjohnny/manifest'.\r\n\r\nChanges not staged for commit:\r\n (use \"git add <file>...\" to update what will be committed)\r\n (use \"git restore <file>...\" to discard changes in working directory)\r\n modified: dffml/util/testing/consoletest/commands.py\r\n\r\nno changes added to commit (use \"git add\" and/or \"git commit -a\")\r\n(.venv.windows) PS C:\\Users\\Johnny\\Documents\\python\\dffml> git diff\r\ndiff --git a/dffml/util/testing/consoletest/commands.py b/dffml/util/testing/consoletest/commands.py\r\nindex 7807c99ff..f83d3fb12 100644\r\n--- a/dffml/util/testing/consoletest/commands.py\r\n+++ b/dffml/util/testing/consoletest/commands.py\r\n@@ -7,7 +7,6 @@ import sys\r\n import json\r\n import time\r\n import copy\r\n-import fcntl\r\n import shlex\r\n import signal\r\n import atexit\r\n(.venv.windows) PS C:\\Users\\Johnny\\Documents\\python\\dffml> git log -n 1\r\ncommit 80dc54afb6ee201342ba216fecfaf5ae160686a7 (HEAD -> manifest, pdxjohnny/manifest)\r\nAuthor: John Andersen <johnandersenpdx@gmail.com>\r\nDate: Sat Feb 19 20:35:22 2022 -0800\r\n\r\n operations: innersource: Fix tests to clone and check for workflows using git operations\r\n\r\n Signed-off-by: John Andersen <johnandersenpdx@gmail.com>\r\n```"
},
{
"body": "## 2022-10-10 IETF SCITT Weekly\r\n\r\n- Previous meeting notes: [2022-09-29 IETF SCITT Technical Meeting](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3763647)\r\n- Charter is expected to be finalized by tomorrow\r\n - We had about 4+ weeks of review (which is good, we wanted to have time for people to review)\r\n - Will follow the IETF process more rigorously after initiated (we don't know all of what that entails yet :)\r\n - We will then have IETF tools at our workgroups disposal\r\n- We are currently meeting a lot\r\n - We will sawmp the upcoming meeting schdule this way\r\n - We will have three interums per two weeks if we maintain our current cadence\r\n - We might be overusing the meeting system\r\n - Two tracks\r\n - Weekly Monday\r\n - Fortnightly technical\r\n - working group formal chairs will do this\r\n - Eliot seems unlikley to have bandwidth beyond the BoF\r\n- Upcomming IETF 115\r\n - Will do sequency diagram hacking\r\n - They will have a remote experiance so that others can feel like they are in Europe at the table via 360 degree camera\r\n - Orie will be there at 1:15\r\n - Goals\r\n - Ensure we have a through software use case doc\r\n- preliminary agenda: https://datatracker.ietf.org/meeting/115/agenda/\r\n - https://www.ietf.org/how/runningcode/hackathons/115-hackathon/\r\n - https://wiki.ietf.org/en/meeting/115/hackathon\r\n - https://datatracker.ietf.org/meeting/115/important-dates/\r\n - chair logistics - Chairs 10 min\u00a0 \r\n - starting adoption of first I-D (architecture) - Henk 20 min\u00a0 \r\n - receipt definition (recap & discussion)k) - Sylvan 15 min\u00a0 \r\n - COSE merkle tree proofs (options, pros & cons) - Mailk 20 min\u00a0 \r\n - detailed use case I-D: software supply chain - Orie 25 min\r\n- How do we deal with SPDX no assertion on insert?\r\n- TODO\r\n - [ ] Add self attestations to osftware use case folow chart\r\n - [ ] Ensure we mention how this works with the standard github workflow and sigstore\r\n - [ ] I have vetted this via code review\r\n - [ ] NIST currently only cares about the presence of the SBOM as the attestation (case 0)"
}
]
},
{
"body": "# 2022-10-11 Engineering Logs\r\n\r\n- First automated async comms post worked! https://github.com/intel/dffml/actions/workflows/alice_async_comms.yml\r\n - https://docs.github.com/en/actions/creating-actions/metadata-syntax-for-github-actions#branding\r\n- SCITT\r\n - https://github.com/ietf-scitt/scitt-web/blob/main/content/what-is-scitt.md\r\n- Issue Ops\r\n - https://github.com/valet-customers/issue-ops",
"replies": [
{
"body": "## 2022-10-11 @pdxjohnny Engineering Logs\r\n\r\n- https://docs.github.com/en/actions/security-guides/automatic-token-authentication\r\n- source data flow as class\r\n - update\r\n - record to mongo doc operation\r\n - overlay/ride for custom (camel case feature keys for example)\r\n - mongo doc upsert operation\r\n- https://mobile.twitter.com/kpcyrd/status/1579617445824040960\r\n - > I don't think there's anything that can be used as an unlink(2) primitive, the Docker Image Spec has something vaguely similar by special-casing files that start with `.wh.`, putting `RUN touch /etc/.wh.os-release` in your Dockerfile deletes /etc/os-release in the final image. \ud83e\udd77\r\n- https://www.civo.com/learn/kubernetes-power-for-virtual-machines-using-kubevirt\r\n- https://github.com/kubevirt/kubevirt\r\n- https://github.com/dffml/dffml-pre-image-removal/commits/shouldi_dep_tree\r\n- https://github.com/chainguard-dev/melange/pull/128/files\r\n - Golang CLI library Cobra has docs generation\r\n- https://github.com/intel/dffml/actions/runs/3228504774/jobs/5284698480\r\n - Manifest consumption worked\r\n - https://github.com/intel/dffml/commit/0ba6357165cfd69583a7564edf8ec6d77157fcfa\r\n\r\n```\r\nError response from daemon: failed to create shim: OCI runtime create failed: runc create failed: unable to start container process: exec: \"tail\": executable file not found in $PATH: unknown\r\n```\r\n\r\n[Build: Images: Containers: .github#L1](https://github.com/intel/dffml/commit/74f80dd25577b4047429b00a880f06aaa74829bc#annotation_4889996315)\r\n```\r\nError when evaluating 'strategy' for job 'build'. intel/dffml/.github/workflows/build_images_containers.yml@74f80dd25577b4047429b00a880f06aaa74829bc (Line: 64, Col: 19): Error parsing fromJson,intel/dffml/.github/workflows/build_images_containers.yml@74f80dd25577b4047429b00a880f06aaa74829bc (Line: 64, Col: 19): Invalid property identifier character: \\. Path '[0]', line 1, position 2.,intel/dffml/.github/workflows/build_images_containers.yml@74f80dd25577b4047429b00a880f06aaa74829bc (Line: 64, Col: 19): Unexpected type of value '', expected type: Sequence.\r\n```"
}
]
},
{
"body": "# 2022-10-12 Engineering Logs",
"replies": [
{
"body": "- https://docs.github.com/en/developers/webhooks-and-events/webhooks/webhook-events-and-payloads#push\r\n- https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#push\r\n\r\n```console\r\n$ git log -n 2\r\ncommit b6f9725a5eaa1696904a6b07ded61a27ba5e5b29 (HEAD -> alice, upstream/alice)\r\nAuthor: john-s-andersen <john.s.andersen@intel.com>\r\nDate: Wed Oct 12 18:00:57 2022 +0000\r\n\r\n util: df: internal: Fix for Python 3.9.13 hasattr not detecting NewType.__supertype__ in generator\r\n\r\n Signed-off-by: john-s-andersen <john.s.andersen@intel.com>\r\n\r\ncommit fb5d646e7099f62cb5c34b936d19c1af30c055a7\r\nAuthor: John Andersen <johnandersenpdx@gmail.com>\r\nDate: Tue Oct 11 17:56:59 2022 -0700\r\n\r\n docs: tutorials: rolling alice: forward: Add link to John^2 Living Threat Models Are Better Than Dead Threat Models talk\r\n$ gh api https://api.github.com/repos/intel/dffml/compare/fb5d646e7099f62cb5c34b936d19c1af30c055a7...b6f9725a5eaa1696904a6b07ded61a27ba5e5b29 | jq -r '.files[].filename'\r\ndffml/util/df/internal.py\r\n```\r\n\r\n- Clipped API output\r\n\r\n```json\r\n{\r\n \"files\": [\r\n {\r\n \"sha\": \"55960cf9ea7036a0fcfd68d7799ff1567a876158\",\r\n \"filename\": \"dffml/util/df/internal.py\",\r\n \"status\": \"modified\",\r\n \"additions\": 4,\r\n \"deletions\": 1,\r\n \"changes\": 5,\r\n \"blob_url\": \"https://github.com/intel/dffml/blob/b6f9725a5eaa1696904a6b07ded61a27ba5e5b29/dffml%2Futil%2Fdf%2Finternal.py\",\r\n \"raw_url\": \"https://github.com/intel/dffml/raw/b6f9725a5eaa1696904a6b07ded61a27ba5e5b29/dffml%2Futil%2Fdf%2Finternal.py\",\r\n \"contents_url\": \"https://api.github.com/repos/intel/dffml/contents/dffml%2Futil%2Fdf%2Finternal.py?ref=b6f9725a5eaa1696904a6b07ded61a27ba5e5b29\",\r\n \"patch\": \"@@ -24,6 +24,9 @@ def object_to_operations(obj, module=None):\\n obj,\\n predicate=lambda i: inspect.ismethod(i)\\n or inspect.isfunction(i)\\n- and not hasattr(i, \\\"__supertype__\\\"),\\n+ and not hasattr(i, \\\"__supertype__\\\")\\n+ # NOTE HACK\r\n Fails in 3.9.13 to remove\\n+ # NewType without the check in the str repr.\\n+ and \\\" NewType \\\" not in str(i),\\n )\\n ]\"\r\n }\r\n ]\r\n}\r\n```\r\n\r\n```python\r\nimport os\r\nimport json\r\nimport pathlib\r\nimport urllib.request\r\n\r\nowner, repository = os.environ[\"OWNER_REPOSITORY\"].split(\"/\", maxsplit=1)\r\n\r\nwith urllib.request.urlopen(\r\n urllib.request.Request(\r\n os.environ[\"COMPARE_URL\"],\r\n headers={\r\n \"Authorization\": \"bearer \" + os.environ[\"GH_ACCESS_TOKEN\"],\r\n },\r\n )\r\n) as response:\r\n response_json = json.load(response)\r\n\r\n# Build the most recent commit\r\ncommit = response_json[\"commits\"][-1][\"sha\"]\r\n\r\nmanifest = list([\r\n {\r\n \"image_name\": pathlib.Path(compare_file[\"filename\"]).stem,\r\n \"dockerfile\": compare_file[\"filename\"],\r\n \"owner\": owner,\r\n \"repository\": repository,\r\n \"branch\": os.environ[\"BRANCH\"],\r\n \"commit\": commit,\r\n }\r\n for compare_file in response_json[\"files\"]\r\n if compare_file[\"filename\"].startswith(os.environ[\"PREFIX\"])\r\n])\r\n\r\nprint(json.dumps(manifest, sort_keys=True, indent=4))\r\nprint(\"::set-output name=matrix::\" + json.dumps({\"include\": manifest}))\r\n```\r\n\r\n```console\r\n$ PREFIX=dffml GH_ACCESS_TOKEN=$(grep oauth_token < ~/.config/gh/hosts.yml | sed -e 's/ oauth_token: //g') BRANCH=main OWNER_REPOSITORY=intel/dffml 
COMPARE_URL=https://api.github.com/repos/intel/dffml/compare/a75bef07fd1279f1a36a601d4e652c2b97bfa1de...b6f9725a5eaa1696904a6b07ded61a27ba5e5b29 python test.py\r\n[\r\n {\r\n \"branch\": \"main\",\r\n \"commit\": \"b6f9725a5eaa1696904a6b07ded61a27ba5e5b29\",\r\n \"dockerfile\": \"dffml-base.Dockerfile\",\r\n \"image_name\": \"dffml-base\",\r\n \"owner\": \"intel\",\r\n \"repository\": \"dffml\"\r\n }\r\n]\r\n::set-output name=matrix::{\"include\": [{\"image_name\": \"dffml-base\", \"dockerfile\": \"dffml-base.Dockerfile\", \"owner\": \"intel\", \"repository\": \"dffml\", \"branch\": \"main\", \"commit\": \"b6f9725a5eaa1696904a6b07ded61a27ba5e5b29\"}]}\r\n```"
},
{
"body": "## 2022-10-12 Rolling Alice: Architecting Alice: OS DecentrAlice: Engineering Logs\r\n\r\n```console\r\n$ mkdir -p $(dirname /boot/EFI/BOOT/BOOTX64.EFI)\r\n$ cp boot/efi/EFI/Linux/linux-*.efi /boot/EFI/BOOT/BOOTX64.EFI\r\n```\r\n\r\n- New approch, fedora cloud `.iso` -> qmeu (`qemu convert .iso .qcow2`)\r\n- `qemu-img resize fedora.qcow2 +10G`\r\n- mess with partition tables to create new partition\r\n- Dump wolfi to it\r\n- Configure systemd to start sshd from wolfi\r\n- John ran out of disk space again"
}
]
},
{
"body": "# 2022-10-13 Engineering Logs\r\n\r\n- SCITT\r\n - https://github.com/ietf-scitt/scitt-web/blob/main/content/what-is-scitt.md\r\n - https://medium.com/@nis.jespersen/the-united-nations-trust-graph-d65af7b0b678\r\n - [2022-10-13 IETF SCITT Technical Meeting](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3871185)\r\n- References\r\n - https://github.com/transmute-industries/jsonld-to-cypher",
"replies": [
{
"body": "## 2022-10-13 Rolling Alice: Architecting Alice: OS DecentrAlice: Engineering Logs\r\n\r\n- New approch, fedora cloud `.iso` -> qmeu (`qemu convert .iso .qcow2`)\r\n- `qemu-img resize fedora.qcow2 +10G`\r\n- mess with partition tables to create new partition\r\n- Dump wolfi to it\r\n- Configure systemd to start sshd from wolfi\r\n- Configure systemd to start actions runner from wolfi\r\n- Run `alice shouldi contribute` data flows\r\n- sigstore github actions OIDC token\r\n - self-attested (github assisted) scan data\r\n - SCITT OpenSSF Metrics Use Case\r\n - https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md\r\n- Future\r\n - TPM secure boot on the VM\r\n- References\r\n - https://www.qemu.org/docs/master/system/images.html\r\n - https://duckduckgo.com/?q=raw+to+qcow2&ia=web\r\n - https://www.aptgetlife.co.uk/kvm-converting-virtual-disks-from-raw-img-files-to-qcow2/\r\n - https://alt.fedoraproject.org/cloud/\r\n - https://download.fedoraproject.org/pub/fedora/linux/releases/36/Cloud/x86_64/images/Fedora-Cloud-Base-36-1.5.x86_64.raw.xz\r\n - Cloud Base compressed raw image\r\n - https://download.fedoraproject.org/pub/fedora/linux/releases/36/Cloud/x86_64/images/Fedora-Cloud-Base-36-1.5.x86_64.qcow2\r\n - Cloud Base image for Openstack\r\n\r\n```console\r\n$ qemu-img convert -O qcow2 -p Fedora-Cloud-Base-36-1.5.x86_64.raw Fedora-Cloud-Base-36-1.5.x86_64.qcow2\r\n(0.00/100%)\r\n```\r\n\r\n```console\r\n$ curl -sfLOC - https://download.fedoraproject.org/pub/fedora/linux/releases/36/Cloud/x86_64/images/Fedora-Cloud-Base-36-1.5.x86_64.qcow2\r\n$ qemu-img resize Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +10G\r\n$ sudo dnf -y install guestfs-tools libvirt\r\n$ sudo systemctl enable --now libvirtd\r\n$ LIBGUESTFS_BACKEND=direct sudo -E virt-filesystems --long -h --all -a Fedora-Cloud-Base-36-1.5.x86_64.qcow2\r\nName Type VFS Label MBR Size Parent\r\n/dev/sda1 filesystem unknown - - 1.0M -\r\n/dev/sda2 filesystem ext4 boot - 966M -\r\n/dev/sda3 filesystem vfat - - 100M -\r\n/dev/sda4 filesystem unknown - - 4.0M -\r\n/dev/sda5 filesystem btrfs fedora - 3.9G -\r\nbtrfsvol:/dev/sda5/root filesystem btrfs fedora - - -\r\nbtrfsvol:/dev/sda5/home filesystem btrfs fedora - - -\r\nbtrfsvol:/dev/sda5/root/var/lib/portables filesystem btrfs fedora - - -\r\n/dev/sda1 partition - - - 1.0M /dev/sda\r\n/dev/sda2 partition - - - 1000M /dev/sda\r\n/dev/sda3 partition - - - 100M /dev/sda\r\n/dev/sda4 partition - - - 4.0M /dev/sda\r\n/dev/sda5 partition - - - 3.9G /dev/sda\r\n/dev/sda device - - - 5.0G -\r\n$ qemu-img resize Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +10G\r\nImage resized.\r\n$ LIBGUESTFS_BACKEND=direct sudo -E virt-filesystems --long -h --all -a Fedora-Cloud-Base-36-1.5.x86_64.qcow2\r\nName Type VFS Label MBR Size Parent\r\n/dev/sda1 filesystem unknown - - 1.0M -\r\n/dev/sda2 filesystem ext4 boot - 966M -\r\n/dev/sda3 filesystem vfat - - 100M -\r\n/dev/sda4 filesystem unknown - - 4.0M -\r\n/dev/sda5 filesystem btrfs fedora - 3.9G -\r\nbtrfsvol:/dev/sda5/root filesystem btrfs fedora - - -\r\nbtrfsvol:/dev/sda5/home filesystem btrfs fedora - - -\r\nbtrfsvol:/dev/sda5/root/var/lib/portables filesystem btrfs fedora - - -\r\n/dev/sda1 partition - - - 1.0M /dev/sda\r\n/dev/sda2 partition - - - 1000M /dev/sda\r\n/dev/sda3 partition - - - 100M /dev/sda\r\n/dev/sda4 partition - - - 4.0M /dev/sda\r\n/dev/sda5 partition - - - 3.9G /dev/sda\r\n/dev/sda device - - - 15G -\r\n```\r\n\r\n```console\r\n$ cp Fedora-Cloud-Base-36-1.5.x86_64.qcow2.bak Fedora-Cloud-Base-36-1.5.x86_64.qcow2 $ 
truncate -r Fedora-Cloud-Base-36-1.5.x86_64.qcow2 Fedora-Cloud-Base-36-1.5.x86_64.2.qcow2\r\n$ truncate -s +20GB Fedora-Cloud-Base-36-1.5.x86_64.2.qcow2 $ LIBGUESTFS_BACKEND=direct sudo -E virt-resize --resize /dev/sda5=+1G Fedora-Cloud-Base-36-1.5.x86_64.qcow2 Fedora-Cloud-Base-36-1.5.x86_64.2.qcow2\r\n[ 0.0] Examining Fedora-Cloud-Base-36-1.5.x86_64.qcow2\r\n**********\r\n\r\nSummary of changes:\r\n\r\nvirt-resize: /dev/sda1: This partition will be left alone.\r\n\r\nvirt-resize: /dev/sda2: This partition will be left alone.\r\n\r\nvirt-resize: /dev/sda3: This partition will be left alone.\r\n\r\nvirt-resize: /dev/sda4: This partition will be left alone.\r\n\r\nvirt-resize: /dev/sda5: This partition will be resized from 3.9G to 4.9G.\r\n\r\nvirt-resize: There is a surplus of 13.0G. An extra partition will be\r\ncreated for the surplus.\r\n\r\n**********\r\n[ 7.9] Setting up initial partition table on Fedora-Cloud-Base-36-1.5.x86_64.2.qcow2\r\n[ 28.5] Copying /dev/sda1\r\n[ 28.5] Copying /dev/sda2\r\n 100% \u27e6\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u27e7 00:00\r\n[ 37.0] Copying /dev/sda3\r\n[ 37.3] Copying /dev/sda4\r\n[ 37.4] Copying /dev/sda5\r\n 100% \u27e6\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u2592\u27e7 00:00\r\n\r\nvirt-resize: Resize operation completed with no errors. 
Before deleting\r\nthe old disk, carefully check that the resized disk boots and works\r\ncorrectly.\r\n```\r\n\r\n- https://linux.die.net/man/1/virt-resize\r\n\r\n```console\r\n$ curl -sfLOC - https://download.fedoraproject.org/pub/fedora/linux/releases/36/Cloud/x86_64/images/Fedora-Cloud-Base-36-1.5.x86_64.qcow2\r\n$ qemu-img resize Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +10G\r\n$ sudo dnf -y install guestfs-tools libvirt\r\n$ sudo systemctl enable --now libvirtd\r\n$ qemu-img resize Fedora-Cloud-Base-36-1.5.x86_64.qcow2 +20G\r\n$ cp Fedora-Cloud-Base-36-1.5.x86_64.qcow2.bak Fedora-Cloud-Base-36-1.5.x86_64.qcow2\r\n$ LIBGUESTFS_BACKEND=direct sudo -E virt-resize --resize /dev/sda5=+1G Fedora-Cloud-Base-36-1.5.x86_64.qcow2 Fedora-Cloud-Base-36-1.5.x86_64.2.qcow2\r\n$ qemu-system-x86_64 -no-reboot -smp cpus=2 -m 4096M -enable-kvm -nographic -cpu host -drive file=/home/pdxjohnny/Fedora-Cloud-Base-36-1.5.x86_64.2.qcow2,if=v2\r\nSeaBIOS (version 1.16.0-1.fc36)\r\n\r\n\r\niPXE (https://ipxe.org) 00:03.0 CA00 PCI2.10 PnP PMM+BFF8C110+BFECC110 CA00\r\n\r\n\r\n\r\nBooting from Hard Disk...\r\nGRUB loading.\r\nWelcome to GRUB!\r\n\r\n GNU GRUB version 2.06\r\n\r\n \u250c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2510\r\n \u2502*Fedora Linux (5.17.5-300.fc36.x86_64) 36 (Cloud Edition) \u2502\r\n```\r\n\r\n- Still seeing issues with bad superblocks\r\n- https://gist.github.com/pdxjohnny/6063d1893c292d1ac0024fb14d1e627d\r\n\r\n```\r\ne2fsck: Bad magic number in super-block while trying to open /dev/nbd1p5\r\n/dev/nbd1p5:\r\nThe superblock could not be read or does not describe a valid ext2/ext3/ext4\r\nfilesystem. 
If the device is valid and it really contains an ext2/ext3/ext4\r\nfilesystem (and not swap or ufs or something else), then the superblock\r\nis corrupt, and you might try running e2fsck with an alternate superblock:\r\n e2fsck -b 8193 <device>\r\n or\r\n e2fsck -b 32768 <device>\r\n\r\n```\r\n\r\n- New new approach, packer: https://www.packer.io/downloads\r\n - https://www.packer.io/plugins/builders/openstack\r\n - https://www.packer.io/plugins/builders/digitalocean\r\n - https://www.packer.io/plugins/builders/qemu\r\n - https://www.packer.io/plugins/datasources/git/commit\r\n - Manifest\r\n - https://www.packer.io/plugins/builders/digitalocean#user_data\r\n - https://gist.github.com/pdxjohnny/a0dc3a58b4651dc3761bee65a198a80d#file-run-vm-sh-L156-L205\r\n - Enable github actions on boot via systemd here\r\n- https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-container-registry\r\n- https://gist.github.com/nickjj/d63d1e0ee71f4226ac5000bf1022bb38\r\n- https://gist.github.com/pdxjohnny/5f358e749181fac74a750a3d00a74b9e\r\n\r\n**osdecentralice.json**\r\n\r\n```json\r\n{\r\n \"variables\": {\r\n \"version\": \"latest\",\r\n \"do_token\": \"{{env `DIGITALOCEAN_TOKEN`}}\"\r\n },\r\n \"builders\": [\r\n {\r\n \"type\": \"digitalocean\",\r\n \"api_token\": \"{{user `do_token`}}\",\r\n \"image\": \"fedora-36-x64\",\r\n \"region\": \"sfo3\",\r\n \"size\": \"m3-2vcpu-16gb\",\r\n \"ssh_username\": \"root\",\r\n \"droplet_name\": \"osdecentralice-{{user `version`}}\",\r\n \"snapshot_name\": \"osdecentralice-{{user `version`}}-{{timestamp}}\"\r\n }\r\n ],\r\n \"provisioners\": [\r\n {\r\n \"type\": \"shell\",\r\n \"inline\": [\r\n \"set -x\",\r\n \"set -e\",\r\n \"dnf upgrade -y\",\r\n \"dnf install -y podman\",\r\n \"curl -sfLC - -o Dockerfile https://gist.github.com/pdxjohnny/5f358e749181fac74a750a3d00a74b9e/raw/f93d3831f94f58751d85f71e8e266f6020042323/Dockerfile\",\r\n \"sha256sum -c -<<<'b5f31acb1ca47c55429cc173e08820af4a19a32685c5e6c2b1459249c517cbb5 Dockerfile'\",\r\n \"podman build -t osdecentralice:latest - < Dockerfile\",\r\n \"container=$(podman run --rm -d --entrypoint tail osdecentralice -F /dev/null);\",\r\n \"trap \\\"podman kill ${container}\\\" EXIT\",\r\n \"sleep 1\",\r\n \"podman cp \\\"${container}:/\\\" /wolfi\"\r\n ]\r\n }\r\n ]\r\n}\r\n```\r\n\r\n```console\r\n$ sudo -E packer build osdecentralice.json\r\n```\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/195759634-4493d348-fb66-41ba-a531-330e7e5662c7.png)\r\n\r\n```console\r\n digitalocean: --> 7b72b288ae3\r\n digitalocean: [2/2] STEP 8/8: ENTRYPOINT bash\r\n digitalocean: [2/2] COMMIT osdecentralice:latest\r\n digitalocean: --> 919ae809e98\r\n digitalocean: Successfully tagged localhost/osdecentralice:latest\r\n digitalocean: 919ae809e9841893f046cd49950c4515b04bb24db5d87f1de52168275860ebec\r\n==> digitalocean: ++ podman run --rm -d --entrypoint tail osdecentralice -F /dev/null\r\n==> digitalocean: + container=0c0d3ad9125c981aff17b78ee38c539229b444e546a4e346bc1f86d7ca0480fb\r\n==> digitalocean: + trap 'podman kill 0c0d3ad9125c981aff17b78ee38c539229b444e546a4e346bc1f86d7ca0480fb' EXIT\r\n==> digitalocean: + sleep 1\r\n==> digitalocean: + podman cp 0c0d3ad9125c981aff17b78ee38c539229b444e546a4e346bc1f86d7ca0480fb:/ /wolfi\r\n==> digitalocean: + podman kill 0c0d3ad9125c981aff17b78ee38c539229b444e546a4e346bc1f86d7ca0480fb\r\n digitalocean: 0c0d3ad9125c981aff17b78ee38c539229b444e546a4e346bc1f86d7ca0480fb\r\n==> digitalocean: Gracefully shutting down droplet...\r\n==> 
digitalocean: Creating snapshot: osdecentralice-latest-1665722921\r\n==> digitalocean: Waiting for snapshot to complete...\r\n==> digitalocean: Destroying droplet...\r\n==> digitalocean: Deleting temporary ssh key...\r\nBuild 'digitalocean' finished after 10 minutes 12 seconds.\r\n\r\n==> Wait completed after 10 minutes 12 seconds\r\n\r\n==> Builds finished. The artifacts of successful builds are:\r\n--> digitalocean: A snapshot was created: 'osdecentralice-latest-1665722921' (ID: 118836442) in regions 'sfo3'\r\n++ history -a\r\npdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 ~ $\r\n```\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/195765976-fe432d96-b2ca-4a10-a595-b82acaf0f463.png)\r\n\r\n- Now to install github actions runner in wolfi, and configure systemd to auto start it (a hedged systemd sketch follows at the end of this log).\r\n - Ideally we figure out how to deploy a bunch of these, terraform?\r\n - They need to be ephemeral and shutdown after each job\r\n - Threat vector: Compromise by threat actor results in system not triggering shutdown.\r\n - Mitigation: Reap out of band\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/195766172-7898c5ce-de9a-48cc-a2d4-331a7e614dd3.png)\r\n\r\n```console\r\n[root@osdecentralice-latest-1665722921-s-4vcpu-8gb-sfo3-01 ~]# chroot /wolfi /usr/bin/python\r\nPython 3.10.7 (main, Jan 1 1970, 00:00:00) [GCC 12.2.0] on linux\r\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\r\n>>> import pathlib\r\n>>> print(pathlib.Path(\"/etc/os-release\").read_text())\r\nID=wolfi\r\nNAME=\"Wolfi\"\r\nPRETTY_NAME=\"Wolfi\"\r\nVERSION_ID=\"20220913\"\r\nHOME_URL=\"https://wolfi.dev\"\r\n\r\n>>>\r\n```\r\n\r\n[![asciicast](https://asciinema.org/a/528221.svg)](https://asciinema.org/a/528221)\r\n\r\n[![asciicast](https://asciinema.org/a/528220.svg)](https://asciinema.org/a/528220)\r\n\r\n[![asciicast](https://asciinema.org/a/528223.svg)](https://asciinema.org/a/528223)\r\n\r\n
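A minimal sketch of the systemd auto-start piece (assumptions: the runner is already configured inside the wolfi chroot at `/actions-runner` via `config.sh --ephemeral`; the unit name and paths here are hypothetical, and upstream `actions/runner` also ships its own `svc.sh` installer):\r\n\r\n```console\r\n$ cat <<'EOF' | sudo tee /etc/systemd/system/github-actions-runner.service\r\n[Unit]\r\nDescription=GitHub Actions runner (wolfi chroot)\r\nAfter=network-online.target\r\n\r\n[Service]\r\n# An --ephemeral runner exits and deregisters after a single job; a fresh\r\n# registration token is needed before the next start (reap / re-provision\r\n# out of band, per the mitigation note above).\r\nExecStart=/usr/sbin/chroot /wolfi /actions-runner/run.sh\r\n\r\n[Install]\r\nWantedBy=multi-user.target\r\nEOF\r\n$ sudo systemctl enable --now github-actions-runner.service\r\n```"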
},
{
"body": "## 2022-10-13 IETF SCITT Technical Meeting\r\n\r\n- WG Chartered!\r\n - https://mailarchive.ietf.org/arch/msg/scitt/OsUTPGEUUVQGxcU1J8UostNs1iM/\r\n - https://datatracker.ietf.org/doc/charter-ietf-scitt/\r\n - https://vocabulary.transmute.industries/\r\n- Semantic Versioning\r\n - Ray would like to see this included in software use case.\r\n - Policy around update\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice\r\n- Facilitate post instance creation labeling\r\n - Notary adds to transparency infrastructure at a later point, how do we ensure others have access to that?\r\n - They should go query those notaries or require up to date receipts from them.\r\n- We don't care so much about what's in the SBOM, it's just data\r\n- There may be many SBOMs for a single release of software, they could be insert by multiple notaries using different scanner implementations.\r\n- Trust graphs constricuted at a later date\r\n - Orie Steele (Transmute):\r\n - 'In our world, these are \u201cgraph queries\u201d... the graphs are built from the registry data. joined with other data. I don't see SCITT as solving for graph queries\u2026 it just provides a data set that is projected into the graph'\r\n- Can't we just always use a recpit to auth?\r\n\r\nSource: https://github.com/ietf-scitt/scitt-web/blob/main/content/what-is-scitt.md\r\n\r\n![scii-persistance](https://github.com/ietf-scitt/scitt-web/raw/main/content/media/scitt-persistence.png)"
},
{
"body": "## 2022-10-13 @pdxjohnny Engineering Logs\r\n\r\n- https://github.com/actions/runner/compare/main...fgalind1:runner:k8s-support\r\n- https://github.com/uor-community/ai-model-registry\r\n - https://gist.github.com/usrbinkat/761d8f2f4da018d861451aff45b2cde7\r\n - https://universalreference.io/docs/intro\r\n - This is aligned\r\n - > Why would you want to link something like web pages or any content via attributes?\r\nThis might seem arbitrary at first glance, but it is a fundamental concept in human cognition. We describe a table to another person via its attributes i.e. Dark wood, 18x2in rectangular legs, round top... If we\u2019ve been precise enough in our description, another person would be able to pick that table out of a showroom of tables. UOR takes this concept and applies it to everything. We can then train AI models on a uniformly formatted internet containing contextually linked data.\r\n - https://www.mdpi.com/2504-2289/5/4/56/htm\r\n - > With the rapid development of 5G communications, enhanced mobile broadband, massive machine type communications and ultra-reliable low latency communications are widely supported. However, a 5G communication system is still based on Shannon\u2019s information theory, while the meaning and value of information itself are not taken into account in the process of transmission. Therefore, it is difficult to meet the requirements of intelligence, customization, and value transmission of 6G networks. In order to solve the above challenges, we propose a 6G mailbox theory, namely a cognitive information carrier to enable distributed algorithm embedding for intelligence networking. Based on Mailbox, a 6G network will form an intelligent agent with self-organization, self-learning, self-adaptation, and continuous evolution capabilities. With the intelligent agent, redundant transmission of data can be reduced while the value transmission of information can be improved. Then, the features of mailbox principle are introduced, including polarity, traceability, dynamics, convergence, figurability, and dependence. Furthermore, key technologies with which value transmission of information can be realized are introduced, including knowledge graph, distributed learning, and blockchain. Finally, we establish a cognitive communication system assisted by deep learning. The experimental results show that, compared with a traditional communication system, our communication system performs less data transmission quantity and error.\r\n- https://github.com/chainguard-dev/apko\r\n - container build pipelines but with manifests for apko\r\n- TODO\r\n - [ ] https://universalreference.io/docs/Quick%20Start/intro#publishing-a-collection\r\n - Related: #1207\r\n - https://github.com/uor-framework/uor-client-go#build-a-schema-into-an-artifact\r\n - Possibly build schema for inputs to containers as manifests emebedded / mapped to CLI or config format?"
}
]
},
{
"body": "# 2022-10-14 Engineering Logs",
"replies": [
{
"body": "## 2022-10-14 @pdxjohnny Engineering Logs\r\n\r\n- Alice helps you understand what your software is EATing, what\u2019s the health of its software supply chain (food as the biological supply chain). You are what you EAT and your software is its development health! You get out what you put in lifecycle wise.\r\n- https://github.com/ossf/scorecard/blob/main/docs/checks.md\r\n- https://gist.github.com/pdxjohnny/f56e73b82c1ea24e1e7d6b995a566984\r\n- https://github.com/sigstore/gitsign#environment-variables\r\n - > Env var | | | |\r\n > -- | -- | -- | --\r\n > GITSIGN_FULCIO_URL | \u2705 | https://fulcio.sigstore.dev | Address of Fulcio server\r\n > GITSIGN_LOG | \u274c | \u00a0 | Path to log status output. Helpful for debugging when no TTY is available in the environment.\r\n > GITSIGN_OIDC_CLIENT_ID | \u2705 | sigstore | OIDC client ID for application\r\n > GITSIGN_OIDC_ISSUER | \u2705 | https://oauth2.sigstore.dev/auth | OIDC provider to be used to issue ID token\r\n > GITSIGN_OIDC_REDIRECT_URL | \u2705 | \u00a0 | OIDC Redirect URL\r\n > GITSIGN_REKOR_URL | \u2705 | https://rekor.sigstore.dev | Address of Rekor server"
}
]
},
{
"body": "# 2022-10-15 Engineering Logs\r\n\r\n- http://blockexplorer.graft.network/\r\n- Async Comms\r\n - Examples\r\n - At 07:34 -7 UTC @pdxjohnny started drafting the tutorial: `Rolling Alice: Coach Alice: You are what you EAT!`\r\n - Others with the GitHub discussions thread loaded in their browser (at least on desktop) will see updates soon after he edits comments and replies in the thread.\r\n - Possible aligned tutorial sketch follows: `Rolling Alice: Architecting Alice: Thought Communication Protocol Case Study: DFFML`\r\n - We will combine GitHub Actions on discussion edit trigger with [`scripts/dump_discussion.py`](https://github.com/intel/dffml/blob/ed4d806cf2988793745905578a0adc1b02e7eeb6/scripts/dump_discussion.py)\r\n - We will replicate this data to DIDs and run DWN `serviceEndpoint` s as needed.\r\n - system context as service endpoint or executed locally if sandboxing / orchestrator policy permits.\r\n - See early architecting Alice Engineering Log lossy cached streams of consciousness for more detail\r\n - https://www.youtube.com/playlist?list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK\r\n - We will attest data using reusable workflows, OIDC, and sigstore\r\n - We will run more rekor / fulcio instances\r\n - We will network via webrtc and DERP\r\n - We will write orchestration operations / data flows / overlays and use data flow as class to leverage them via double context entry pattern (or some other way to do that).\r\n - We will see the same effect, but in a more DID based way with abstract implementation / infra\r\n - This will be mentioned as being a follow on to the tutorial: `Rolling Alice: Architecting Alice: Stream of Consciousness`\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md\r\n - Alice will filter by updates relevant to the downstream receiver of events based on their current state, context, etc.\r\n - https://twitter.com/SergioRocks/status/1580545209678454784\r\n - > ![\"Because Jade had more uninterrupted Deep Work time than Brayan. Those 4 interruptions that Brayan suffered amounted for an actual loss of 3 hours of productive work on the tasks assigned to him.\" Sergio Pereira](https://pbs.twimg.com/media/Fe85fdaXgAEhe4_?format=png)\r\n - She will notify or etc. as appropriate based off prioritizer's thoughts on \r\n - **TODO** implement the prioritizer concept as another tutorial\r\n - Similar to \"Bob Online\" or \"Alice Online\" message from webhook based tutorial but ran through data flow / overlayed logic to determine relevance and what to do / say. Also it's now including Decentralized Web Nodes and DIDs. Possible next step / future in this (aligned clusters) train of thought would be:\r\n - KERI encapsulation over arbitrary channels\r\n - NLP to summarize git log changes\r\n - Hook up to git log\r\n - CI integration to serialize to sensible information format\r\n - Eventually Alice will be able to tell us whatever we want to know.\r\n - In the future (current date 2022-10-15), when you want to know something\r\n about Alice, she'll be able to tell you, because she knows about her\r\n own codebase, and she has solid foundations for security and trust and\r\n alignment with your strategic principles / values. 
She's a trustworthy\r\n messenger, the Ghost in the shell.\r\n - See discussion thread (or the thread dump in `docs/arch/alice/discussion`)\r\n - https://github.com/intel/dffml/tree/alice/docs/arch/alice/discussion\r\n - `$ git log -p --reverse -p -- docs/arch/alice/discussion`\r\n - https://github.com/intel/dffml/discussions/1369",
"replies": [
{
"body": "# Rolling Alice: Coach Alice: You are what you EAT!\r\n\r\nAlice helps you understand what your software is EATing, what's\r\nthe health of its software supply chain (food as the biological supply\r\nchain). You are what you EAT and your software is its development health!\r\nYou get out what you put in lifecycle wise.\r\n\r\nAlice is our software developer coach. She helps us help ourselves.\r\nIf Alice was coaching us on being healthier person, she would tell\r\nus to look at our digestion! When building software our measuring the\r\nhealth of our digestion is aligned with measuring our progress towards\r\nreaching critical velocity.\r\n\r\nIn this tutorial we'll follow on to the Down the Dependency Rabbit Hole\r\nAgain tutorial and get more into seeing the lifecycle of the project\r\nand it's health as critical in the security of the project. We'll\r\ntreat the health of the lifecycle as an asset to be protected in our\r\nthreat model `alice threats` / `THREATS.md`.\r\n\r\n- References\r\n - https://github.com/johnlwhiteman/living-threat-models\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md\r\n - https://cloud.google.com/blog/products/devops-sre/dora-2022-accelerate-state-of-devops-report-now-out\r\n - DORA metrics\r\n - Culture\r\n - happiness == good\r\n - **TODO** find link about happiness in article based of 2022 dora report results\r\n - https://www.gutenberg.org/files/11/11-h/11-h.htm\r\n - https://colab.research.google.com/drive/1gol0M611zXP6Zpggfri-fG8JDdpMEpsI\r\n - Trying to generate images for this tutorial using the public domain images from the original Alice's Adventures in Wonderland as overlays (img2img)..."
},
{
"body": "## 2022-10-15 @pdxjohnny Engineering Logs\r\n\r\n- Wolfi\r\n - https://edu.chainguard.dev/open-source/apko/overview/\r\n- Packer\r\n - https://www.packer.io/docs/post-processors/manifest\r\n- https://github.com/intel/dffml/issues/1334\r\n- Vol 6: Happy happy joy joy\r\n - Positive thinking\r\n - Document one up and one down\r\n- Vol 6: intro: Then it\u2019s a wonderful dream\r\n - Sequence similar to Peace at Last\r\n - Alice: \u201cMaybe it\u2019s a dream?\u201d\r\n - \u201cThen it\u2019s a wonderful dream\u201d"
}
]
},
{
"body": "# 2022-10-16 Engineering Logs",
"replies": [
{
"body": "- stable diffusion\r\n - https://github.com/divamgupta/stable-diffusion-tensorflow/pull/50/files\r\n- reinforcement learning\r\n - https://arxiv.org/abs/1903.00714\r\n - RL for supply chain\r\n - https://github.com/facebookresearch/mvfst-rl\r\n - > mvfst-rl is a framework for network congestion control in the QUIC transport protocol that leverages state-of-the-art in asynchronous Reinforcement Learning training with off-policy correction.\r\n- GitHub Actions\r\n - https://github.com/GoogleContainerTools/kaniko#running-kaniko-in-docker\r\n - See if updating the `build_images_containers.yml` works if we add these volume mounts and so forth.\r\n - There may have been an OCI image issue. Maybe we can rebuild and push in docker format?\r\n - Lets just switch to podman or docker onstead of kaniko because we know that works on actions\r\n- Container Registry\r\n - Provide on demand image builds where final layers are just added staticlly\r\n - https://github.com/ImJasonH/kontain.me\r\n - https://github.com/google/go-containerregistry/blob/a0f66878d01286cac42d99fb45e3b335710c00a5/pkg/v1/random/image.go\r\n - These layers then have their SBOM added where they have provenance as the data provenance for the addition of the layer\r\n - Then we have content addressability and SBOM and provenance from sigstore etc. via existing registry interoperability tooling\r\n - Compute contracts can be issued by having the pull from the registry be authed by verifiable credential\r\n - Registry releases content addressable verifiable with SCITT recpit of release (because data might be sensitive, need confirmed release in case of need to revoke / roll keys)\r\n- Created DigitalOcean space data.nahdig.com\r\n - data.nahdig.com is for data with suspect provenance\r\n - No `.` in any names in DO spaces! Certs will fail!\r\n - We have taken no steps to think about hardening on OS DecentrAlice yet within context of scanning\r\n - We should assume VM compromise, aka, data is best effort\r\n - Hence nahdig\r\n - Data from systems with provenance and hardening will be served from data.chadig.com\r\n - https://nahdig.sfo3.cdn.digitaloceanspaces.com/\r\n - https://nahdig.sfo3.digitaloceanspaces.com/\r\n - https://data.nahdig.com/\r\n - `contribute.shouldi.alice.data.nahdig.com`\r\n\r\n![create-digitalocean-space-data.nahdig.com](https://user-images.githubusercontent.com/5950433/196057425-a8b74ec5-9c24-42d3-8693-373a61be5d13.png)\r\n"
}
]
},
{
"body": "# 2022-10-17 Engineering Logs\r\n\r\n- https://github.com/m00sey/canis\r\n- https://github.com/ioflo\r\n- https://github.com/decentralized-identity/keri/blob/master/kids/kid0003.md\r\n- https://github.com/build-trust/ockam\r\n - > trust for data\r\n - https://github.com/build-trust/ockam/tree/develop/documentation/use-cases/end-to-end-encrypt-all-application-layer-communication#readme\r\n- https://github.com/WebOfTrust/keri-dht-py\r\n - ~~Try spinning this up~~ outdated\r\n - https://github.com/WebOfTrust/keri\r\n - Process side note: We could communicate with Alice by having her post a discussion comment reply and then edit it to include instructions, she then fills reply with work / (sub) list items with her summary of progress/ results\r\n- https://github.com/ioflo/hio\r\n- TODO\r\n - [ ] Docker and ghcr builds and packer do build\r\n - [ ] Infra DO automation as operations executed in preapply? Of k8s job orchestrator\r\n - [ ] Deploy k3s by default in vm os image\r\n - [ ] Run actions runner controller on VMs\r\n - [ ] Run scan from github actions self hosted DO backed\r\n - [ ] Crawler to find repos",
"replies": [
{
"body": "## 2022-10-17 @pdxjohnny Engineering Logs\r\n\r\n- https://w3c.github.io/dpv/dpv/\r\n- https://github.com/GLEIF-IT/sally\r\n- https://github.com/comunica/comunica/tree/master/engines/query-sparql#readme\r\n - https://www.w3.org/TR/sparql11-update/\r\n - Could be used during tbDEX negotiation of compute contract\r\n- https://ruben.verborgh.org/blog/2018/12/28/designing-a-linked-data-developer-experience/ \r\n - https://comunica.github.io/Article-ISWC2018-Demo-GraphQlLD/\r\n - https://comunica.github.io/Article-ISWC2018-Resource/\r\n - > Local and remote dataset dumps in RDF serializations\r\n - https://ontola.io/blog/rdf-serialization-formats/#tldr\r\n - https://comunica.dev/research/link_traversal/\r\n - https://comunica.github.io/comunica-feature-link-traversal-web-clients/builds/solid-prov-sources/#transientDatasources=https%3A%2F%2Fwww.rubensworks.net%2F\r\n - could post cached serializations to github pages to uodate as CMS\r\n - Could extend to execute data flows on resolution (hiting and endpoint)\r\n - Need to figure out how to serialize, will analyze data from demos to look for patterns in links and resolvable URLS\r\n - Will try to use localhost run and python builtin http.server to query data\r\n - Stand up query server if nessicary\r\n - Wget mirror to cache everything or something like that\r\n - Then need to figure out sigstore / rekor provenance\r\n - http://videolectures.net/iswc2014_verborgh_querying_datasets/\r\n - https://github.com/rdfjs/comunica-browser\r\n - https://github.com/LinkedDataFragments/Server.js/blob/6bdb7f4af0af003213c4765065961ca77594aa63/packages/datasource-sparql/lib/datasources/SparqlDatasource.js#L31-L76\r\n- Cloud Development Environments\r\n - https://github.com/coder/coder/tree/main/examples/templates/do-linux\r\n - https://github.com/nestybox/sysbox\r\n - https://coder.com/docs/coder-oss/latest/templates/change-management\r\n - https://coder.com/docs/coder-oss/latest/secrets#dynamic-secrets\r\n - > Dynamic secrets are attached to the workspace lifecycle and automatically injected into the workspace. With a little bit of up front template work, they make life simpler for both the end user and the security team. 
This method is limited to [services with Terraform providers](https://registry.terraform.io/browse/providers), which excludes obscure API providers.\r\n - https://coder.com/docs/coder-oss/latest/admin/automation\r\n - Example uses https://registry.terraform.io/providers/RJPearson94/twilio/latest/docs/resources/iam_api_key\r\n - https://github.com/RJPearson94/terraform-provider-twilio/blob/07460ebdef45d59a52eef13f8bdb9ff0a7219c83/twilio/provider.go#L46\r\n - > `Sensitive: true,`\r\n - https://github.com/RJPearson94/terraform-provider-twilio/blob/61b96f0beb6e5827037ddf2db7b160b52df7c666/examples/credentials/aws/outputs.tf\r\n - https://github.com/hashicorp/terraform-provider-external/blob/1aff6be074b053de5cc86ca3dc5cac122e8cedcd/internal/provider/test-programs/tf-acc-external-data-source/main.go#L34-L37\r\n - https://www.terraform.io/language/functions/sensitive\r\n - https://coder.com/docs/coder-oss/latest/dotfiles\r\n - https://coder.com/docs/coder-oss/latest/templates#parameters\r\n - https://registry.terraform.io/providers/hashicorp/external/latest/docs/data-sources/data_source\r\n - Store secrets in GitHub\r\n - Run workflow\r\n - Network with DERP\r\n - Start callback endpoint on port 0 for random port (`dffml-service-http`)\r\n - https://pkg.go.dev/tailscale.com/derp\r\n - > Package derp implements the Designated Encrypted Relay for Packets (DERP) protocol. DERP routes packets to clients using *curve25519* keys as addresses. DERP is used by Tailscale nodes to proxy encrypted WireGuard packets through the Tailscale cloud servers when a direct path cannot be found or opened. DERP is a last resort. Both sides between very aggressive NATs, firewalls, no IPv6, etc? Well, DERP.\r\n - Send back secrets and OIDC token to callback endpoint using public key provided as input (TODO KERI)\r\n- Web UI Testing\r\n - https://github.com/mobile-dev-inc/maestro\r\n- DID\r\n - https://github.com/orgs/w3c/repositories?language=&q=did&sort=&type=all\r\n - https://w3c.github.io/did-imp-guide/\r\n - https://github.com/w3c/did-spec-registries/compare/main...pdxjohnny:did-spec-registries:open-architecture-and-alice\r\n - Need to understand if this is appropriate\r\n - Goal: Define how DID operations could be used to execute the content addressable contracts\r\n - See kontain.me references and notes towards bottom of today's engineering logs\r\n - `did:alice:sha256:01`\r\n - https://identity.foundation/keri/did_methods/\r\n - https://w3c.github.io/did-rubric/\r\n\r\n### DID Method Registration\r\n\r\nAs a DID method registrant, I have ensured that my DID method registration complies with the following statements:\r\n\r\n- [ ] The DID Method specification [defines the DID Method Syntax](https://w3c.github.io/did-core/#method-syntax).\r\n- [ ] The DID Method specification [defines the Create, Read, Update, and Deactivate DID Method Operations](https://w3c.github.io/did-core/#method-operations).\r\n- [ ] The DID Method specification [contains a Security Considerations section](https://w3c.github.io/did-core/#security-requirements).\r\n- [ ] The DID Method specification [contains a Privacy Considerations section](https://w3c.github.io/did-core/#privacy-requirements).\r\n- [ ] The JSON file I am submitting has [passed all automated validation tests below](#partial-pull-merging).\r\n- [x] The JSON file contains a `contactEmail` address [OPTIONAL].\r\n- [x] The JSON file contains a `verifiableDataRegistry` entry [OPTIONAL].\r\n - There will be a registry but primarily our goal is to enable sandboxed distributed 
compute\r\n\r\n---\r\n\r\n- DFFML\r\n - Write operations, use octx.ictx directly (see the round trip sketch at the end of this log):\r\n - memory_input_network_input_context_to_dict\r\n - dict_to_json\r\n - dict_to_did_serialized\r\n - Takes Credential Manifest (and wallet ref?)\r\n - memory_input_network_input_context_merge_from_dict\r\n - dict_from_json\r\n - dict_to_did_serialized\r\n - Takes Credential Manifest? Or JSON-LD / graphql-ld or maybe just data flow to validate verifiable credentials needed are present (and wallet ref?)\r\n - https://w3c.github.io/did-rubric/\r\n - memory_input_network_serve_strawberry_graphql\r\n - graphql_query\r\n - watch_for_compute_contracts\r\n - Watch stream of consciousness for new compute contracts read / verify via container image on demand registry\r\n - Eventually overlay for input network and associated operations to keep more performant series snapshot data. `List[memory_input_network_input_context_to_dict.outputs.result]` for each change to the input network. Enables rollback to any point as cached state or modification throughout.\r\n- Kubernetes\r\n - https://k3s.io/\r\n - https://github.com/k3s-io/k3s/releases/tag/v1.25.2%2Bk3s1\r\n - Add to OS DecentrAlice\r\n- apko\r\n - https://github.com/chainguard-dev/apko/tree/main/examples\r\n- KCP\r\n - https://github.com/kcp-dev/kcp\r\n - > kcp is a Kubernetes-like control plane focusing on: A control plane for many independent, isolated \"clusters\" known as workspaces\r\n - Great, this could satisfy our workspace manager component requirement\r\n within the abstract compute architecture.\r\n - Add to OS DecentrAlice\r\n - Need to figure out how to DWN network on boot and establish webrtc channels\r\n (or other channels).\r\n - Need to figure out how to automate and make cluster config / discovery dynamic\r\n and transparent on each running user instance of OS DecentrAlice.\r\n - Enable two use cases\r\n - Automated deployment, autostart on boot systemd config UNIX socket for kcp\r\n - End user on system, autostart on boot user login systemd config UNIX socket for kcp\r\n\r\n```mermaid\r\ngraph TD\r\n    subgraph abstract_compute_architecture[Abstract Compute Architecture]\r\n        derp[DERP Server]\r\n        subgraph devenv[Developer Environment]\r\n            editor[Editor]\r\n            terminal[Terminal]\r\n            browser[Browser]\r\n        end\r\n        workspace_management[Workspace Management]\r\n        iasc[Infrastructure as Code]\r\n        osdecentralice[OS DecentrAlice]\r\n\r\n        editor --> |http2| derp\r\n        terminal --> |http2| derp\r\n        browser --> |http2| derp\r\n\r\n        derp --> workspace_management\r\n        workspace_management --> iasc\r\n\r\n        iasc --> kcp\r\n        kcp --> k3s\r\n        k3s --> osdecentralice\r\n\r\n        derp --> osdecentralice\r\n    end\r\n```\r\n\r\n - https://github.com/kcp-dev/kcp/blob/main/docs/concepts.md\r\n - https://github.com/kcp-dev/kcp/blob/main/docs/virtual-workspaces.md\r\n - https://github.com/kcp-dev/kcp/blob/main/docs/content/en/main/concepts/workspaces.md\r\n - > Multi-tenancy is implemented through workspaces. A workspace is a Kubernetes-cluster-like HTTPS endpoint, i.e. an endpoint usual Kubernetes client tooling (client-go, controller-runtime and others) and user interfaces (kubectl, helm, web console, ...) 
can talk to like to a Kubernetes cluster.\r\n- Downstream validation / stream of consciousness tutorial part\r\n - Automating an entity's post to the daily engineering logs\r\n - Via receipt of downstream event and trigger of graphql comment\r\n reply addition to thread.\r\n- TODO\r\n - [ ] SECURITY Check KCP hard/soft multi-tenancy threat model info\r\n or ascertain if not present."
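- The round trip sketch referenced above: the `memory_input_network_input_context_to_dict` style operations don't exist yet, so this is plain Python standing in for the DFFML input network types, just to pin down the intended snapshot shape.\r\n\r\n```python\r\n# Sketch: input network context -> dict -> JSON -> merge back in.\r\n# Plain dataclasses standing in for DFFML's input network types.\r\nimport json\r\nfrom dataclasses import dataclass, field\r\n\r\n\r\n@dataclass\r\nclass InputContext:\r\n    # Mapping of definition name -> list of input values.\r\n    inputs: dict = field(default_factory=dict)\r\n\r\n    def to_dict(self) -> dict:\r\n        # memory_input_network_input_context_to_dict\r\n        return {'inputs': self.inputs}\r\n\r\n    def merge_from_dict(self, snapshot: dict) -> None:\r\n        # memory_input_network_input_context_merge_from_dict\r\n        for definition, values in snapshot['inputs'].items():\r\n            self.inputs.setdefault(definition, []).extend(values)\r\n\r\n\r\nictx = InputContext({'URL': ['https://github.com/intel/dffml']})\r\nserialized = json.dumps(ictx.to_dict())           # dict_to_json\r\nrestored = InputContext()\r\nrestored.merge_from_dict(json.loads(serialized))  # dict_from_json + merge\r\nassert restored.inputs == ictx.inputs\r\n```\r\n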
}
]
},
{
"body": "# 2022-10-18 Engineering Logs",
"replies": [
{
"body": "## 2022-10-18 @pdxjohnny Engineering Logs\r\n\r\n- https://github.com/OR13/didme.me\r\n - Goal: Connect this to our content addressable (container images) compute contract stuff\r\n - Seeing ipfs project id errors on did resolution in deployed demo\r\n - Cloning to see what's up...\r\n - https://classic.yarnpkg.com/en/docs/install#centos-stable\r\n - https://github.com/transmute-industries/verifiable-actions\r\n - https://lucid.did.cards/identifiers/did:key:z6MkrJx9cCCpu7D1Scy7QovGeWShHzfSPHJXxNq5TwbZzkRF\r\n - https://api.did.actor/v/eJylkluTmjAAhf9L9lVuQRHy1PVa1hV1FW-dTidA0CgCkiDqjv-9Qbfu5a3tDC-QMyfnfIdX8M1PYk6OHKAfYM15ypCiFEUhF7qcZCsFqpqp-BkJSMwpjphy0EDlXajT4Co7-FJGDomPOU1iKaKMS1CFaqn-WQE0AAjkWYzynAZIgx7ENQNKej2sSlWz6kkWVLFkaoFp-pDU61gXd_BTSspQU5LRkGIvIs17jKspYznJhHEgPLfkhM5Gf8vp-LzvWNt9EbjmdBs0esea0V6c52G6ip2h-9g9xyn1HToLY3DzwLFPWpiLu4Aoq0qqJp6JZiGoI1hdCtV7_THHPGcAvd4q_cGAUyqLFDL2eZIpX0AwRZM3LIkf1Hsp8HKXPAtFSerNuQKyT0d2HJAjQOrX7x9Q_GUMcPlUKPc2xOf3RiVLcsS7NCJiJ70Up1mShKXgLXs7gLWaZo3pKhaRM1L-ITdIAmJwpQihJEBq5gRCpBqoapYUD9cdb4n6hK-T4D-2e_iHsa9FhnmWJqzsgRkj2YcwFbApxLSAnJ7WXtenA_rUWbZfJqOxzeydDZ2mbSx3HeZDV7w7Jzwf0UHE6GKzUO1Is2R519nXnPHam-xC95dUJaQlhaee1js0u43mqWX2oNtsuI5r9fXFLHkeH767Z1Lf14c5nLoR59Cw54O6ZzT0Ge8VZDRtF_PHEbhcfgMlFDfZ\r\n - https://stackoverflow.com/questions/69692842/error-message-error0308010cdigital-envelope-routinesunsupported\r\n\r\n[![use-the-source](https://img.shields.io/badge/use%20the-source-blueviolet)](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_easter_eggs.md#use-the-source-)\r\n\r\n```console\r\n$ git clone https://github.com/OR13/didme.me\r\n$ yarn install\r\n$ yarn start\r\nfailure...\r\n$ npx next\r\nready - started server on 0.0.0.0:3000, url: http://localhost:3000\r\ninfo - Using webpack 4 in Next.js is deprecated. Please upgrade to using webpack 5: https://nextjs.org/docs/messages/webpack5\r\nwarn - You have enabled experimental feature(s).\r\nwarn - Experimental features are not covered by semver, and may cause unexpected or broken application behavior. 
Use them at your own risk.\r\n\r\nnode:internal/crypto/hash:71\r\n this[kHandle] = new _Hash(algorithm, xofLen);\r\n ^\r\n\r\nError: error:0308010C:digital envelope routines::unsupported\r\n at new Hash (node:internal/crypto/hash:71:19)\r\n at Object.createHash (node:crypto:133:10)\r\n at module.exports.__webpack_modules__.18768.module.exports (/home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:78057:62)\r\n at NormalModule._initBuildHash (/home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:51469:16)\r\n at handleParseError (/home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:51523:10)\r\n at /home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:51555:5\r\n at /home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:51410:12\r\n at /home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:20871:3\r\n at iterateNormalLoaders (/home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:20712:10)\r\n at Array.<anonymous> (/home/pdxjohnny/didme.me/node_modules/next/dist/compiled/webpack/bundle4.js:20703:4) {\r\n opensslErrorStack: [ 'error:03000086:digital envelope routines::initialization error' ],\r\n library: 'digital envelope routines',\r\n reason: 'unsupported',\r\n code: 'ERR_OSSL_EVP_UNSUPPORTED'\r\n}\r\n\r\nNode.js v18.11.0\r\npdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 didme.me $ npx next --help^C\r\npdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 didme.me $\r\npdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 didme.me $ NODE_OPTIONS=--openssl-legacy-provider npx next\r\nready - started server on 0.0.0.0:3000, url: http://localhost:3000\r\ninfo - Using webpack 4 in Next.js is deprecated. Please upgrade to using webpack 5: https://nextjs.org/docs/messages/webpack5\r\nwarn - You have enabled experimental feature(s).\r\nwarn - Experimental features are not covered by semver, and may cause unexpected or broken application behavior. 
Use them at your own risk.\r\n\r\nevent - compiled successfully\r\nAttention: Next.js now collects completely anonymous telemetry regarding usage.\r\nThis information is used to shape Next.js' roadmap and prioritize features.\r\nYou can learn more, including how to opt-out if you'd not like to participate in this anonymous program, by visiting the following URL:\r\nhttps://nextjs.org/telemetry\r\n\r\n\r\n```\r\n\r\n- Live at http://pdxjohnny.devbox.nahdig.com:3000/\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/196558275-ab6e59fb-3e75-44d0-abac-296167b62628.png)\r\n\r\n- Same error, but with traceback popup modal\r\n\r\n```\r\nUnhandled Runtime Error\r\nHTTPError: project id required\r\n\r\nCall Stack\r\n<unknown>\r\nhttperror: project id required\r\nObject.errorHandler [as handleError]\r\nnode_modules/ipfs-http-client/src/lib/core.js (67:0)\r\nasync Client.fetch\r\nnode_modules/ipfs-utils/src/http.js (140:0)\r\nasync addAll\r\nnode_modules/ipfs-http-client/src/add-all.js (19:0)\r\nasync last\r\nnode_modules/it-last/index.js (13:0)\r\n$ git grep ipfs-http-client\r\ncore/ipfs.ts:const ipfsHttpClient = require(\"ipfs-http-client\");\r\n```\r\n\r\n- Attempting to fix IPFS HTTP client code to auth to valid server\r\n- References\r\n - https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/core/ipfs.ts\r\n - https://infura.io/product/ipfs\r\n - Requires API keys, can we run IPFS to HTTP API ourself?\r\n - https://github.com/fission-codes/ipfs-cluster-aws\r\n - https://duckduckgo.com/?q=ipfs+did&ia=web\r\n - https://ipfscluster.io/documentation/deployment/\r\n - https://npm.devtool.tech/ipfs-did-document\r\n - https://github.com/ipfs/js-ipfs/tree/master/packages/ipfs-http-client#readme\r\n - https://github.com/ipfs-examples/js-ipfs-examples/tree/master#ipfs-or-ipfs-core\r\n - https://github.com/ipfs/js-ipfs/tree/master/packages/ipfs-http-server\r\n- Starting javascript ipfs-http-server\r\n\r\n```console\r\n$ yarn add --dev ipfs ipfs-http-server\r\n$ ./node_modules/.bin/jsipfs daemon --offline\r\nInitializing IPFS daemon...\r\nSystem version: x64/linux\r\nNode.js version: 18.11.0\r\nSwarm listening on /ip4/127.0.0.1/tcp/4002/p2p/12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR\r\nSwarm listening on /ip4/143.110.152.152/tcp/4002/p2p/12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR\r\nSwarm listening on /ip4/10.48.0.5/tcp/4002/p2p/12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR\r\nSwarm listening on /ip4/10.124.0.2/tcp/4002/p2p/12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR\r\nSwarm listening on /ip4/10.88.0.1/tcp/4002/p2p/12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR\r\nSwarm listening on /ip4/127.0.0.1/tcp/4003/ws/p2p/12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR\r\njs-ipfs version: 0.16.1\r\nHTTP API listening on /ip4/127.0.0.1/tcp/5002/http\r\ngRPC listening on /ip4/127.0.0.1/tcp/5003/ws\r\nGateway (read only) listening on /ip4/127.0.0.1/tcp/9090/http\r\nWeb UI available at http://127.0.0.1:5002/webui\r\nDaemon is ready\r\n(node:415890) ExperimentalWarning: The Fetch API is an experimental feature. This feature could change at any time\r\n(Use `node --trace-warnings ...` to show where the warning was created)\r\n```\r\n\r\n\r\n```console\r\n$ ./node_modules/.bin/jsipfs cat /ipfs/QmRaaUwTNfwgFZpeUy8qrZwrp2dY4kCKmmB5xEqvH3vtD1/readme\r\n(node:288039) ExperimentalWarning: The Fetch API is an experimental feature. 
This feature could change at any time\r\n(Use `node --trace-warnings ...` to show where the warning was created)\r\nHello and Welcome to IPFS!\r\n\r\n\u2588\u2588\u2557\u2588\u2588\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2557\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2557\r\n\u2588\u2588\u2551\u2588\u2588\u2554\u2550\u2550\u2588\u2588\u2557\u2588\u2588\u2554\u2550\u2550\u2550\u2550\u255d\u2588\u2588\u2554\u2550\u2550\u2550\u2550\u255d\r\n\u2588\u2588\u2551\u2588\u2588\u2588\u2588\u2588\u2588\u2554\u255d\u2588\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2557\r\n\u2588\u2588\u2551\u2588\u2588\u2554\u2550\u2550\u2550\u255d \u2588\u2588\u2554\u2550\u2550\u255d \u255a\u2550\u2550\u2550\u2550\u2588\u2588\u2551\r\n\u2588\u2588\u2551\u2588\u2588\u2551 \u2588\u2588\u2551 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2551\r\n\u255a\u2550\u255d\u255a\u2550\u255d \u255a\u2550\u255d \u255a\u2550\u2550\u2550\u2550\u2550\u2550\u255d\r\n\r\nIf you're seeing this, you have successfully installed\r\nIPFS and are now interfacing with the ipfs merkledag!\r\n\r\n -------------------------------------------------------\r\n| Warning: |\r\n| This is alpha software. Use at your own discretion! |\r\n| Much is missing or lacking polish. There are bugs. |\r\n| Not yet secure. Read the security notes for more. |\r\n -------------------------------------------------------\r\n\r\nCheck out some of the other files in this directory:\r\n\r\n ./about\r\n ./help\r\n ./quick-start <-- usage examples\r\n ./readme <-- this file\r\n ./security-notes\r\n```\r\n\r\n- https://github.com/ipfs/js-ipfs/search?l=JavaScript&p=1&q=js-ipfs+version\r\n - https://github.com/ipfs/js-ipfs/blob/74aee8b3d78f233c3199a3e9a6c0ac628a31a433/packages/ipfs-cli/src/commands/daemon.js#L103\r\n - https://www.npmjs.com/package/@libp2p/logger\r\n - https://github.com/ipfs/js-ipfs/blob/74aee8b3d78f233c3199a3e9a6c0ac628a31a433/packages/ipfs-cli/src/commands/daemon.js#L83-L84\r\n - https://github.com/ipfs/js-ipfs/blob/dfc43d4e9be67fdf25553677f469379d966ff806/packages/ipfs-daemon/src/index.js#L11\r\n\r\n```console\r\n$ echo '{\"Addresses\": [\"0.0.0.0\"]}' | python -m json.tool | tee init_config.json\r\n$ echo -e 'export PATH=\"${PATH}:${HOME}/didme.me/node_modules/.bin\"' | tee -a ~/.bashrc ~/.bash_profile\r\n$ DEBUG=ipfs:* ./node_modules/.bin/jsipfs daemon --offline --init-config init_config.json 2>&1 | tee output.txt Initializing IPFS daemon...\r\nSystem version: x64/linux\r\nNode.js version: 18.11.0\r\n2022-10-19T02:03:35.088Z ipfs:daemon starting\r\n2022-10-19T02:03:35.098Z ipfs:repo opening at: /home/pdxjohnny/.jsipfs\r\n2022-10-19T02:03:35.099Z ipfs:repo init check\r\n2022-10-19T02:03:35.111Z ipfs:repo:lock:fs locking /home/pdxjohnny/.jsipfs/repo.lock\r\n2022-10-19T02:03:35.122Z ipfs:repo acquired repo.lock\r\n2022-10-19T02:03:35.125Z ipfs:repo:version comparing version: 12 and 12\r\n2022-10-19T02:03:35.132Z ipfs:repo creating datastore\r\n2022-10-19T02:03:35.146Z ipfs:repo creating blocks\r\n2022-10-19T02:03:35.148Z ipfs:repo creating keystore\r\n2022-10-19T02:03:35.149Z ipfs:repo creating pins\r\n2022-10-19T02:03:35.150Z ipfs:repo all opened\r\n2022-10-19T02:03:35.289Z ipfs:components:ipns initializing IPNS keyspace (offline)\r\n2022-10-19T02:03:35.341Z ipfs:daemon Using wrtc for webrtc support\r\n2022-10-19T02:03:42.943Z ipfs:mfs:stat Fetching stats for /\r\n2022-10-19T02:03:42.968Z ipfs:mfs:utils:with-mfs-root Loaded MFS root 
/ipfs/QmUNLLsPACCz1vLxQVkXqqLX5R1X345qqfHbsf67hvA3Nn\r\n2022-10-19T02:03:43.467Z ipfs:mfs-preload monitoring MFS root QmUNLLsPACCz1vLxQVkXqqLX5R1X345qqfHbsf67hvA3Nn\r\n2022-10-19T02:03:43.468Z ipfs:http-api starting\r\n2022-10-19T02:03:45.190Z ipfs:cli TypeError: Cannot read properties of undefined (reading 'info')\r\n at HttpApi.start (file:///home/pdxjohnny/didme.me/node_modules/ipfs-http-server/src/index.js:119:52)\r\n at async Daemon.start (file:///home/pdxjohnny/didme.me/node_modules/ipfs-daemon/src/index.js:43:5)\r\n at async Object.handler (file:///home/pdxjohnny/didme.me/node_modules/ipfs-cli/src/commands/daemon.js:99:7)\r\n```\r\n\r\n---\r\n\r\n- https://github.com/laurent85v/archuseriso\r\n- https://mags.zone/help/arch-usb.html\r\n - This website is awesome\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/196555852-ef9356e9-bcb2-4991-bce5-9cc9e8c0b2c2.png)\r\n\r\n- https://github.com/dylanaraps/pywal\r\n- https://github.com/arcmags/ramroot\r\n- https://github.com/justinpinkney/stable-diffusion#fine-tuning\r\n - See if we can do software / open architecture/ data flow / alice as input/output\r\n- https://github.com/google/prompt-to-prompt\r\n- https://github.com/dragonflydb/dragonfly\r\n- Content addressable service endpoints\r\n - Resolvable via system context execution\r\n - How to chain Verifiable Credential requests and executions?\r\n- Questions for Orie\r\n - Where to focus implementation work?\r\n - What processes to be aware of?\r\n - Best practices\r\n - Spec writing\r\n - DID method\r\n - Applicability with content addressable hybrid off chain execution via services endpoints?\r\n- What groups to be aware of?\r\n- https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0006_os_decentralice.md\r\n - Updated from engineering logs: [2022-10-13 Rolling Alice: Architecting Alice: OS DecentrAlice: Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3870218)\r\n - Next steps\r\n - https://www.packer.io/plugins/builders/qemu\r\n- https://hackaday.io/project/187780-wifi-cam-20\r\n- https://github.com/chainguard-dev/text4shell-policy/blob/284462ddb9cd9025ca0efa1d9f74c8f681ed622e/slsa.csv \r\n- https://docs.google.com/document/d/17n8hfdPfqfpbPj4ss-ep4nCkpp9ZBoy6U2Q1t7j-knI/edit\r\n - https://twitter.com/mfosterio/status/1582089134436294656\r\n- https://www.youtube.com/watch?v=LUF7plExdv8\r\n - https://json-ld.org/\r\n - https://twitter.com/mfosterio/status/1582072270083993600\r\n - https://github.com/rubensworks/jsonld-streaming-parser.js\r\n - We don't care about parsing yet (we might when loading caching)\r\n - We'll prototype with serialization and query via https://comunica.dev\r\n - https://github.com/rubensworks/jsonld-streaming-serializer.js\r\n - https://json-ld.org/contexts/person.jsonld\r\n - http://xmlns.com/foaf/0.1/#term_Agent\r\n - https://github.com/digitalbazaar/pyld\r\n - SECURITY Unmaintained since Aug 6th 2020\r\n - `jsonld.set_document_loader(jsonld.aiohttp_document_loader(timeout=...))`\r\n - https://github.com/digitalbazaar/pyld/tree/master/lib/pyld/documentloader\r\n - https://github.com/digitalbazaar/pyld/blob/master/lib/pyld/documentloader/aiohttp.py\r\n - We can write a document loader that similar to our `serviceEndpoint` work,\r\n encodes the system context to a string.\r\n - The shim (loader) might parse that and based on the context (parsing\r\n json-ld) determine that a URL is a dataflow which says to fetch the\r\n resource.\r\n- 
https://gitlab.alpinelinux.org/alpine/ca-certificates/-/blob/8ccb7c2c2672966030af65dc135890d636c576d1/Makefile#L31\r\n\r\n### Validating QEMU Packer build boots and can execute Alice CLI from `/wolfi` chroot\r\n\r\n- References\r\n - https://www.packer.io/plugins/builders/qemu\r\n - https://docs.fedoraproject.org/en-US/fedora/latest/install-guide/appendixes/Kickstart_Syntax_Reference/#sect-kickstart-commands-sshpw\r\n - https://www.packer.io/community-tools#templates\r\n - https://github.com/boxcutter/fedora\r\n - No strong signs of maintenance but, packer APIs are stable,\r\n and templates provided are pinned to versions.\r\n - https://github.com/boxcutter/fedora/blob/6e5fccff745f4ce7b2951ab6d19cd960f61be32d/fedora29-ws.json\r\n - https://github.com/boxcutter/fedora/blob/main/http/ks-fedora29-ws.cfg\r\n - https://github.com/boxcutter/fedora/blob/6e5fccff745f4ce7b2951ab6d19cd960f61be32d/fedora29-server.json\r\n - https://github.com/boxcutter/fedora/blob/main/http/ks-fedora29-server.cfg\r\n - https://github.com/boxcutter/fedora/blob/6e5fccff745f4ce7b2951ab6d19cd960f61be32d/script/sshd.sh\r\n - https://github.com/boxcutter/fedora/blob/main/LICENSE\r\n - https://alt.fedoraproject.org/cloud/\r\n\r\n```console\r\npdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 ~ $ curl -fLOC - https://download.fedoraproject.org/pub/fedora/linux/releases/36/Cloud/x86_64/images/Fedora-Cloud-Base-36-1.5.x86_64.qcow2\r\n % Total % Received % Xferd Average Speed Time Time Time Current\r\n Dload Upload Total Spent Left Speed\r\n 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r\n100 427M 100 427M 0 0 268M 0 0:00:01 0:00:01 --:--:-- 355M\r\npdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 ~ $ sha256sum Fedora-Cloud-Base-36-1.5.x86_64.qcow2\r\nca9e514cc2f4a7a0188e7c68af60eb4e573d2e6850cc65b464697223f46b4605 Fedora-Cloud-Base-36-1.5.x86_64.qcow2\r\n````\r\n\r\n- Added Fedora 36 Cloud support to boxcutter Fedora packer templates\r\n\r\n```console\r\npdxjohnny@fedora-s-4vcpu-8gb-sfo3-01 boxcutter-fedora $ git log -n 1\r\ncommit 6e5fccff745f4ce7b2951ab6d19cd960f61be32d (HEAD -> main, origin/main, origin/HEAD)\r\nAuthor: Mischa Taylor <57647141+taylorific@users.noreply.github.com>\r\nDate: Fri May 28 07:21:41 2021 -0700\r\n\r\n Update README.md\r\n```\r\n\r\n```diff\r\ndiff --git a/fedora.json b/fedora.json\r\nindex 851882f..20b7f62 100644\r\n--- a/fedora.json\r\n+++ b/fedora.json\r\n@@ -1,6 +1,33 @@\r\n {\r\n \"_command\": \"Build with `packer build fedora.json`\",\r\n \"builders\": [\r\n+ {\r\n+ \"boot_command\": [\r\n+ \"<tab> linux inst.text biosdevname=0 inst.ks=http://{{ .HTTPIP }}:{{ .HTTPPort}}/{{ user `kickstart` }}<enter><enter>\"\r\n+ ],\r\n+ \"boot_wait\": \"10s\",\r\n+ \"disk_size\": \"{{ user `disk_size` }}\",\r\n+ \"http_directory\": \"http\",\r\n+ \"iso_checksum\": \"{{ user `iso_checksum` }}\",\r\n+ \"iso_urls\": [\r\n+ \"{{ user `iso_path` }}/{{ user `iso_name` }}\",\r\n+ \"{{ user `iso_url` }}\"\r\n+ ],\r\n+ \"shutdown_command\": \"{{ user `shutdown_command` }}\",\r\n+ \"ssh_password\": \"{{ user `ssh_password` }}\",\r\n+ \"ssh_username\": \"{{ user `ssh_username` }}\",\r\n+ \"ssh_timeout\": \"10000s\",\r\n+ \"type\": \"qemu\",\r\n+ \"output_directory\": \"output_fedora_{{ user `vm_name` }}\",\r\n+ \"format\": \"qcow2\",\r\n+ \"accelerator\": \"kvm\",\r\n+ \"net_device\": \"virtio-net\",\r\n+ \"disk_interface\": \"virtio\",\r\n+ \"headless\": true,\r\n+ \"vm_name\": \"{{ user `vm_name` }}\",\r\n+ \"memory\": \"{{ user `memory` }}\",\r\n+ \"cpus\": \"{{ user `cpus` }}\"\r\n+ },\r\n {\r\n \"boot_command\": [\r\n \"<tab> linux text 
biosdevname=0 ks=http://{{ .HTTPIP }}:{{ .HTTPPort}}/{{ user `kickstart` }}<enter><enter>\"\r\n@@ -10,7 +37,6 @@\r\n \"headless\": \"{{ user `headless` }}\",\r\n \"http_directory\": \"http\",\r\n \"iso_checksum\": \"{{ user `iso_checksum` }}\",\r\n- \"iso_checksum_type\": \"{{ user `iso_checksum_type` }}\",\r\n \"iso_urls\": [\r\n \"{{ user `iso_path` }}/{{ user `iso_name` }}\",\r\n \"{{ user `iso_url` }}\"\r\n@@ -37,7 +63,6 @@\r\n \"headless\": \"{{ user `headless` }}\",\r\n \"http_directory\": \"http\",\r\n \"iso_checksum\": \"{{ user `iso_checksum` }}\",\r\n- \"iso_checksum_type\": \"{{ user `iso_checksum_type` }}\",\r\n \"iso_urls\": [\r\n \"{{ user `iso_path` }}/{{ user `iso_name` }}\",\r\n \"{{ user `iso_url` }}\"\r\n@@ -66,7 +91,6 @@\r\n \"guest_os_type\": \"{{ user `parallels_guest_os_type` }}\",\r\n \"http_directory\": \"http\",\r\n \"iso_checksum\": \"{{ user `iso_checksum` }}\",\r\n- \"iso_checksum_type\": \"{{ user `iso_checksum_type` }}\",\r\n \"iso_urls\": [\r\n \"{{ user `iso_path` }}/{{ user `iso_name` }}\",\r\n \"{{ user `iso_url` }}\"\r\ndiff --git a/fedora36-server.json b/fedora36-server.json\r\nnew file mode 100644\r\nindex 0000000..e0c506c\r\n--- /dev/null\r\n+++ b/fedora36-server.json\r\n@@ -0,0 +1,12 @@\r\n+{\r\n+ \"_comment\": \"Build with `packer build -var-file=fedora36-server.json fedora.json`\",\r\n+ \"vm_name\": \"fedora36-server\",\r\n+ \"cpus\": \"1\",\r\n+ \"disk_size\": \"65536\",\r\n+ \"iso_checksum\": \"421c4c6e23d72e4669a55e7710562287ecd9308b3d314329960f586b89ccca19\",\r\n+ \"iso_name\": \"Fedora-Server-netinst-x86_64-36-1.5.iso\",\r\n+ \"iso_url\": \"https://forksystems.mm.fcix.net/fedora/linux/releases/36/Server/x86_64/iso/Fedora-Server-netinst-x86_64-36-1.5.iso\",\r\n+ \"kickstart\": \"ks-fedora36-server.cfg\",\r\n+ \"memory\": \"2048\",\r\n+ \"update\": \"true\"\r\n+}\r\ndiff --git a/script/sshd.sh b/script/sshd.sh\r\nindex 0d75547..5a5cae2 100644\r\n--- a/script/sshd.sh\r\n+++ b/script/sshd.sh\r\n@@ -6,3 +6,13 @@ echo \"==> Turning off sshd DNS lookup to prevent timeout delay\"\r\n echo \"UseDNS no\" >> /etc/ssh/sshd_config\r\n echo \"==> Disabling GSSAPI authentication to prevent timeout delay\"\r\n echo \"GSSAPIAuthentication no\" >> /etc/ssh/sshd_config\r\n+\r\n+echo \"==> Downloading DecentrAlice sshd banner\"\r\n+# TODO(security) Don't run curl as root\r\n+curl -fLo /etc/ssh/sshd_banner https://gist.github.com/pdxjohnny/5f358e749181fac74a750a3d00a74b9e/raw/42d3d810948fd3326c36dd33d7ebc668b61e0642/sshd_banner\r\n+sha256sum -c - <<<'8ac49ba9114076b59d95b62308adcee046d997e9572f565dcebc97f4e8d6e219 /etc/ssh/sshd_banner' || rm -f /etc/ssh/sshd_banner\r\n+echo \"==> Enabling OS DecentrAlice sshd banner\"\r\n+echo \"Banner /etc/ssh/sshd_banner\" >> /etc/ssh/sshd_config\r\n+\r\n+echo \"==> Enabling Chroot Directory for Wolfi based OS DecentrAlice\"\r\n+echo \"ChrootDirectory /wolfi\" >> /etc/ssh/sshd_config\r\n```\r\n\r\n- It's hung\r\n - https://phoenixnap.com/kb/ssh-port-forwarding\r\n\r\n```console\r\n$ ssh -nNT -L 5900:127.0.0.1:5966 -i ~/.ssh/nahdig -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o PasswordAuthentication=no $USER@143.110.152.152\r\n```\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/196511748-f85944ee-477c-467a-b194-8995c5d2b1e3.png)\r\n\r\n- Found out `ks` is invalid, unused in new versions of fedora\r\n - https://cobbler.github.io/\r\n - https://docs.fedoraproject.org/en-US/fedora/latest/install-guide/advanced/Kickstart_Installations/\r\n - 
https://docs.fedoraproject.org/en-US/fedora/latest/install-guide/advanced/Network_based_Installations/\r\n - https://duckduckgo.com/?q=ks+is+deprecated+and+has+been+removed&ia=web\r\n - https://bugzilla.redhat.com/show_bug.cgi?id=1907566\r\n - https://github.com/beaker-project/beaker/issues/83\r\n - https://access.redhat.com/documentation/en-us/red_hat_enterprise_linux/7/html/installation_guide/chap-anaconda-boot-options#sect-boot-options-deprecated-removed\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/196513493-f01d8d90-2e55-4fa8-b754-bfb2109bf5f6.png)\r\n\r\n- Okay we got a new error: `auth has been removed`\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/196519789-6d100c33-4caa-41a8-9eff-058eefc07444.png)\r\n\r\n- Then we got: `install has been removed`\r\n- https://github.com/hashicorp/packer-plugin-qemu\r\n - https://github.com/hashicorp/packer-plugin-qemu/blob/main/builder/qemu/step_create_vtpm.go\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/196523459-01b0c593-fc61-46fb-bf97-0bf1b3fec586.png)\r\n\r\n- `$ journalctl -xeu anaconda`\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/196544356-369d576e-0cb2-40cf-b6f7-588e995e84ee.png)\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/196546301-1e2e743d-3c4e-487b-bd29-cd36dc0d4120.png)\r\n\r\n```mermaid\r\ngraph TD\r\n subgraph osdecentralice\r\n dwn[SSI Service DWN]\r\n end\r\n subgraph did_alice[did:alice]\r\n serviceEndpoint[serviceEndpoint:serviceendpoint.alice.did.chadig.com]\r\n content_addressable_storage[Container Registry With Layers from Data Flow static or dynamic]\r\n end\r\n```\r\n\r\n- TODO\r\n - [ ] Update Manifest ADR / docs with JSON-LD learnings / make it included\r\n - [ ] Update shim with JSON-LD learnings / make it included\r\n - [ ] Explore https://github.com/arcmags/ramroot"
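- For the \"Update shim with JSON-LD learnings\" TODO: a rough pyld shape for the document loader idea noted earlier in this log (encode the system context in the URL; when the loader sees it, run the data flow to fetch the resource). `run_dataflow_for` and the `dataflow:` scheme are placeholders; the returned dict follows pyld's document loader contract.\r\n\r\n```python\r\n# Sketch: pyld document loader shim; URLs under our hypothetical\r\n# scheme resolve by executing a data flow, everything else via HTTP.\r\nfrom pyld import jsonld\r\n\r\n\r\ndef run_dataflow_for(url: str) -> dict:\r\n    # Placeholder: decode the system context from the URL, run the\r\n    # data flow it describes, return the resulting JSON-LD document.\r\n    raise NotImplementedError(url)\r\n\r\n\r\ndef make_loader():\r\n    fallback = jsonld.requests_document_loader()\r\n\r\n    def loader(url, options=None):\r\n        if url.startswith(('did:', 'dataflow:')):\r\n            return {\r\n                'contentType': 'application/ld+json',\r\n                'contextUrl': None,\r\n                'documentUrl': url,\r\n                'document': run_dataflow_for(url),\r\n            }\r\n        return fallback(url, options or {})\r\n\r\n    return loader\r\n\r\n\r\njsonld.set_document_loader(make_loader())\r\n```\r\n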
}
]
},
{
"body": "# 2022-10-19 Engineering Logs",
"replies": [
{
"body": "## 2022-10-19 @pdxjohnny Engineering Logs\r\n\r\n- https://twitter.com/Buntworthy/status/1582307817884889088\r\n - > Got Imagic running with Stable Diffusion, it's super easy to implement, will share a notebook soon! Left: Input image, Right: Edited \"A photo of Barack Obama smiling big grin\"\r\n- https://twitter.com/prla/status/1582311844269543424\r\n- https://twitter.com/krol_valencia/status/1582727276709679104\r\n - > Do you need Sbom, Sarif or vulnerability format? [#trivy](https://mobile.twitter.com/hashtag/trivy?src=hashtag_click)\r\n > - trivy image \u2014format table alpine:3.10\r\n > - trivy image \u2014format cyclonedx alpine:3.10\r\n > - trivy image --format spdx-json alpine:3.10\r\n > - trivy image --format sarif alpine:3.10\r\n > - trivy image --format cosign-vuln alpine:3.10\r\n- https://twitter.com/PrateekJainDev/status/1582717688652398592\r\n - > ![DED1BDCC-E701-4275-A218-575AAC3DF3FC](https://user-images.githubusercontent.com/5950433/196858876-b9c04512-2105-45fd-beb9-b04d2ae04816.jpeg)\r\n- graph markov neural networks site:github.com offline rl\r\n - Terminal feedback loop, basic sysadmin stuff to start\r\n - https://github.com/ipld/js-dag-pb\r\n - https://github.com/ipld/js-dag-cbor\r\n - https://github.com/libp2p/js-libp2p-webrtc-star\r\n- https://dweb.archive.org/details/home\r\n- https://github.com/ipfs/js-ipfs/blob/master/docs/CONFIG.md\r\n - https://github.com/ipfs/js-ipfs/blob/master/docs/CONFIG.md#webrtcstar\r\n - https://github.com/libp2p/js-libp2p-floodsub\r\n - https://github.com/ipfs/js-ipfs/search?q=%3Aerror+TypeError%3A+fetch+failed&type=issues\r\n - https://github.com/ipfs/js-ipfs/issues/1481#issuecomment-410680460\r\n - https://github.com/multiformats/multiaddr/\r\n - https://github.com/ipfs/specs/blob/main/http-gateways/PATH_GATEWAY.md\r\n - https://github.com/ipfs/specs/blob/main/http-gateways/TRUSTLESS_GATEWAY.md\r\n\r\n**init_config.json**\r\n\r\n```json\r\n{\r\n \"Gateway\": {\r\n \"HTTPHeaders\": {\r\n \"Access-Control-Allow-Origin\": [\r\n \"http://pdxjohnny.devbox.nahdig.com:3000\"\r\n ]\r\n }\r\n },\r\n \"Addresses\": {\r\n \"API\": \"/ip4/0.0.0.0/tcp/5001\",\r\n \"Gateway\": \"/ip4/0.0.0.0/tcp/8080\"\r\n }\r\n}\r\n```\r\n\r\n```console\r\n$ vim node_modules/ipfs-http-server/src/index.js\r\n$ rm -rf /home/pdxjohnny/.jsipfs; DEBUG=ipfs:* ./node_modules/.bin/jsipfs daemon --enable-preload --init-profile server --init-config init_config.json 2>&1 | tee output.ipfs.daemon.$(date -Iseconds).txt\r\n...\r\nconfig\r\n{\r\n Addresses: { API: 'http://0.0.0.0' },\r\n Discovery: {\r\n MDNS: { Enabled: true, Interval: 10 },\r\n webRTCStar: { Enabled: true }\r\n },\r\n Bootstrap: [],\r\n Pubsub: { Router: 'gossipsub', Enabled: true },\r\n Swarm: {\r\n ConnMgr: { LowWater: 50, HighWater: 200 },\r\n DisableNatPortMap: false\r\n },\r\n Routing: { Type: 'dhtclient' },\r\n Identity: {\r\n PeerID: '12D3KooWRunqtKfjPSHsF24iPdrxVQ2gnhBNtBMBKsz6zj6KoXTR',\r\n PrivKey: 'CAESQKlBi28qNtDDVusw/NmEUKEWQ+ZyfYto5ewCb4EtX2KW7x7LeH/arjGtMo8RRl8ydw0UU9uUlLKSJHA8zDS4PqQ='\r\n },\r\n Datastore: { Spec: { type: 'mount', mounts: [Array] } },\r\n Keychain: {\r\n DEK: {\r\n keyLength: 64,\r\n iterationCount: 10000,\r\n salt: 'vTamkostN5h+m+yAbevZDaF6',\r\n hash: 'sha2-512'\r\n }\r\n },\r\n Addressess: [ { info: [Object] } ]\r\n}\r\nheaders\r\n{}\r\napiAddrs\r\nhttp://0.0.0.0\r\n[1666206773378] INFO (3881696 on fedora-s-4vcpu-8gb-sfo3-01): server started\r\n created: 1666206773187\r\n started: 1666206773376\r\n host: \"0.0.0.0\"\r\n port: 43943\r\n protocol: \"http\"\r\n id: 
\"fedora-s-4vcpu-8gb-sfo3-01:3881696:l9g0hqdf\"\r\n uri: \"http://0.0.0.0:43943\"\r\n address: \"0.0.0.0\"\r\n2022-10-19T19:12:53.448Z ipfs:http-api started\r\n2022-10-19T19:12:53.448Z ipfs:http-gateway starting\r\n2022-10-19T19:12:53.450Z ipfs:http-gateway started\r\n2022-10-19T19:12:53.452Z ipfs:daemon started\r\njs-ipfs version: 0.16.1\r\nHTTP API listening on /ip4/0.0.0.0/tcp/43943/http\r\nWeb UI available at http://0.0.0.0:43943/webui\r\nDaemon is ready\r\n```\r\n\r\n- Switching to Golang based IPFS implementation\r\n - https://github.com/ipfs/kubo\r\n - https://dweb.link/ipns/dist.ipfs.tech#kubo\r\n - https://docs.ipfs.tech/how-to/address-ipfs-on-web/#subdomain-gateway\r\n- https://docs.ipfs.tech/how-to/command-line-quick-start/#take-your-node-online\r\n\r\n```console\r\n$ mkdir -p ~/.local\r\n$ echo -e 'export PATH=\"${PATH}:${HOME}/.local/kubo\"' | tee -a ~/.bashrc ~/.bash_profile\r\n$ source ~/.bashrc\r\n$ curl -sfL https://dist.ipfs.tech/kubo/v0.16.0/kubo_v0.16.0_linux-amd64.tar.gz | tar -C ~/.local -vxz\r\n$ ipfs init --profile server\r\n$ ipfs config Addresses.Gateway /ip4/0.0.0.0/tcp/8080\r\n```\r\n\r\n- http://pdxjohnny.devbox.nahdig.com:8080/ipfs/QmQ58yAN4oMsCZwhpHhfWPiFtBgSyxoVn2PFncnpuf5cBX\r\n - `I <3 IPFS -pdxjohnny`\r\n - SECURITY Gateway server is not supposed to be exposed\r\n\r\n```\r\ncreate:1 Access to XMLHttpRequest at 'http://pdxjohnny.devbox.nahdig.com:5001/api/v0/add?stream-channels=true&progress=false' from origin 'http://pdxjohnny.devbox.nahdig.com:3000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.\r\nfetch.browser.js?273a:91 POST http://pdxjohnny.devbox.nahdig.com:5001/api/v0/add?stream-channels=true&progress=false net::ERR_FAILED 403\r\n```\r\n\r\n```console\r\n$ ipfs config --help\r\n$ ipfs daemon --help\r\n$ ipfs config --json API.HTTPHeaders.Access-Control-Allow-Origin \"[\\\"http://pdxjohnny.devbox.nahdig.com:3000\\\"]\"\r\n$ ipfs config --json API.HTTPHeaders.Access-Control-Allow-Methods \"[\\\"PUT\\\", \\\"GET\\\", \\\"POST\\\"]\"\r\n$ ipfs config --json API.HTTPHeaders.Access-Control-Allow-Credentials \"[\\\"true\\\"]\"\r\n$ ipfs daemon\r\n$ curl 'http://pdxjohnny.devbox.nahdig.com:5001/api/v0/add?stream-channels=true&progress=false' \\\r\n -H 'Accept: */*' \\\r\n -H 'Accept-Language: en-US,en;q=0.9' \\\r\n -H 'Connection: keep-alive' \\\r\n -H 'Origin: http://pdxjohnny.devbox.nahdig.com:3000' \\\r\n -H 'Referer: http://pdxjohnny.devbox.nahdig.com:3000/' \\\r\n -H 'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/106.0.0.0 Safari/537.36' \\\r\n -H 'content-type: multipart/form-data; boundary=-----------------------------eWfTjhbnBpWxbCcBUUJEX' \\\r\n --data-raw $'-------------------------------eWfTjhbnBpWxbCcBUUJEX\\r\\nContent-Disposition: form-data; name=\"file\"; filename=\"\"\\r\\nContent-Type: application/octet-stream\\r\\n\\r\\nFILE_DATA\\r\\n-------------------------------eWfTjhbnBpWxbCcBUUJEX--\\r\\n' \\\r\n --compressed \\\r\n --insecure\r\n```\r\n\r\n- Try building static didme.me site and deploying from that\r\n - https://nextjs.org/docs/api-reference/cli#production\r\n\r\n```console\r\n$ npm install\r\n$ NODE_OPTIONS=--openssl-legacy-provider npx next build\r\n$ npx next start -p 3000\r\nTypeError: Bolt URL expected to be string but was: undefined\r\n$ git log -n 1\r\ncommit 14da8e47d8a1a4bef3cc1c85968c9f8b6963d269 (HEAD -> main, origin/main, origin/HEAD)\r\nAuthor: Orie Steele <orie@transmute.industries>\r\nDate: Sun Jul 3 
11:18:36 2022 -0500\r\n\r\n feat: ui/ux\r\n```\r\n\r\n```diff\r\ndiff --git a/core/NFT/NFT.ts b/core/NFT/NFT.ts\r\nindex 054d14c..eae5e76 100644\r\n--- a/core/NFT/NFT.ts\r\n+++ b/core/NFT/NFT.ts\r\n@@ -18,6 +18,11 @@ export const getContract = async (web3: any) => {\r\n };\r\n\r\n export const getHistory = async (did: string) => {\r\n+ return {\r\n+ count: 0,\r\n+ items: [],\r\n+ };\r\n+\r\n const {\r\n NEO4J_CONNECTION,\r\n NEO4J_USERNAME,\r\ndiff --git a/core/ipfs.ts b/core/ipfs.ts\r\nindex 44722cf..a6f8f40 100644\r\n--- a/core/ipfs.ts\r\n+++ b/core/ipfs.ts\r\n@@ -4,28 +4,20 @@ const { urlSource } = ipfsHttpClient;\r\n const ipfsApis = [\r\n {\r\n label: \"localhost\",\r\n- url: \"http://localhost:5001\",\r\n- },\r\n- {\r\n- label: \"infura\",\r\n- url: \"https://ipfs.infura.io:5001\",\r\n+ url: \"http://pdxjohnny.devbox.nahdig.com:5001\",\r\n },\r\n ];\r\n\r\n const ipfsGateways = [\r\n {\r\n label: \"localhost\",\r\n- url: \"http://localhost:8080\",\r\n- },\r\n- {\r\n- label: \"infura\",\r\n- url: \"https://ipfs.infura.io\",\r\n+ url: \"http://pdxjohnny.devbox.nahdig.com:8080\",\r\n },\r\n ];\r\n\r\n-const ipfsApi = ipfsApis[1].url;\r\n+const ipfsApi = ipfsApis[0].url;\r\n\r\n-const ipfsGateway = ipfsGateways[1].url;\r\n+const ipfsGateway = ipfsGateways[0].url;\r\n\r\n const client = ipfsHttpClient({\r\n // url: \"https://ipfs.infura.io:5001\",\r\n```\r\n\r\n```console\r\n$ python -c 'import sys, json, yaml; print(yaml.dump(json.loads(sys.stdin.read())))'\r\n{\"didDocument\":{\"@context\":[\"https://www.w3.org/ns/did/v1\",\"https://w3id.org/security/suites/jws-2020/v1\"],\"id\":\"did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6\",\"verificationMethod\":[{\"id\":\"did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL\",\"type\":\"JsonWebKey2020\",\"controller\":\"did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6\",\"publicKeyJwk\":{\"kty\":\"EC\",\"crv\":\"secp256k1\",\"x\":\"tF8KQenSP2vPS3u-D5oLxwHOZEpSBcujQqGrysimK1E\",\"y\":\"ZZB_Q4oHp3hboXCKYA_c5qEByYKAj2wXC9Rql6LO478\"}}],\"assertionMethod\":[\"did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL\"],\"authentication\":[\"did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL\"],\"capabilityInvocation\":[\"did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL\"],\"capabilityDelegation\":[\"did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL\"],\"keyAgreement\":[\"did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL\"]},\"didResolutionMetadata\":{\"didUrl\":{\"did\":\"did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6\",\"methodName\":\"meme\",\"methodSpecificId\":\"1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6\"}},\"didDocumentMetadata\":{\"image\":\"http://pdxjohnny.devbox.nahdig.com:8080/ipfs/QmSDfug9jdkErKFvE1YHw44yestkppV92ae2qd4EuYHQxJ\",\"ethereum\":{\"address\":\"0x30bB6577432a20d46b29Bd196997a8BA6b97C71b\"},\"bitcoin\":{\"address\":\"mh54xLL62pt5VXKmivS2JYBcv4qNWHJPPo\"}}}\r\n```\r\n\r\n```yaml\r\ndidDocument:\r\n '@context':\r\n - https://www.w3.org/ns/did/v1\r\n - https://w3id.org/security/suites/jws-2020/v1\r\n assertionMethod:\r\n - 
did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL\r\n authentication:\r\n - did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL\r\n capabilityDelegation:\r\n - did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL\r\n capabilityInvocation:\r\n - did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL\r\n id: did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6\r\n keyAgreement:\r\n - did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL\r\n verificationMethod:\r\n - controller: did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6\r\n id: did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6#zQ3shrnCZq3R7vLvDeWQFnxz5HMKqP9JoiMonzYJB4TGYnftL\r\n publicKeyJwk:\r\n crv: secp256k1\r\n kty: EC\r\n x: tF8KQenSP2vPS3u-D5oLxwHOZEpSBcujQqGrysimK1E\r\n y: ZZB_Q4oHp3hboXCKYA_c5qEByYKAj2wXC9Rql6LO478\r\n type: JsonWebKey2020\r\ndidDocumentMetadata:\r\n bitcoin:\r\n address: mh54xLL62pt5VXKmivS2JYBcv4qNWHJPPo\r\n ethereum:\r\n address: '0x30bB6577432a20d46b29Bd196997a8BA6b97C71b'\r\n image: http://pdxjohnny.devbox.nahdig.com:8080/ipfs/QmSDfug9jdkErKFvE1YHw44yestkppV92ae2qd4EuYHQxJ\r\ndidResolutionMetadata:\r\n didUrl:\r\n did: did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6\r\n methodName: meme\r\n methodSpecificId: 1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6\r\n```\r\n\r\n- 2022-04-17: f9d083fc0c99737f131601c1893b79a2c2907f2aa2a4bbe71ea3e4c237f8a51a\r\n- fulcio issue DID (key)?\r\n - https://github.com/sigstore/fulcio/search?q=did\r\n - https://github.com/sigstore/fulcio/blob/fac62ed5e8fc7f4efa40c29ab8e1a5f1552f14bd/pkg/ca/tinkca/signer_test.go#L118\r\n - https://github.com/sigstore/fulcio/blob/fac62ed5e8fc7f4efa40c29ab8e1a5f1552f14bd/pkg/ca/tinkca/signer.go\r\n - https://github.com/sigstore/fulcio/blob/fac62ed5e8fc7f4efa40c29ab8e1a5f1552f14bd/pkg/ca/tinkca/signer.go#L46-L88\r\n - `new(ecdsapb.EcdsaPrivateKey)`\r\n - `new(ed25519pb.Ed25519PrivateKey)`\r\n - `ed25519.NewKeyFromSeed(privKey.GetKeyValue())`\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/0007-A-GitHub-Public-Bey-and-TPM-Based-Supply-Chain-Security-Mitigation-Option.rst\r\n - https://twitter.com/pdxjohnny/status/1524535483396632576\r\n - https://twitter.com/pdxjohnny/status/1524870665764909056?s=20&t=z12dn9tVREZzK7huX6hsSg\r\n - By having fulcio also issue a DID for the attestation we can create dyanmic roots of trust associated with each manifest bom item queried later (at time of use)\r\n - We can export the public portion of the ephemeral DID key from fulcio and then use the DID key based method of verification of the doc contents offline / later\r\n - This also means it's easy to swap out BOM components, because we just swap out the key and did we verify against.\r\n- Clicking around again\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/196825338-ad4f6933-8ee0-438d-911e-cb09aebe6c5f.png)\r\n\r\n> ```console\r\n> $ gh repo clone memes || gh repo create memes --template https://github.com/OR13/did-web-github-did-meme --public --clone\r\n> $ cd memes && ./scripts/install.sh > did:meme:1zgsrnfgfe52zm0tgy4rgj0y5a3lnghmqduyv3yn8uw6tchfpzmxywuch7lza6\r\n> ```\r\n\r\n- 
https://or13.github.io/didme.me/did-method-spec.html\r\n - https://or13.github.io/didme.me/#using-github-pages\r\n - https://github.com/OR13/did-web-github-did-meme\r\n - https://identity.foundation/didcomm-messaging/spec/#out-of-band-messages\r\n- Auth to fulcio issues Verifiable Credential\r\n- Why are we doing this?\r\n - We want to not do risky things! Risky things in this context are executions of system context which have negative impacts on strategic principles\r\n - We want to build Alice to be resilient to the open network\r\n - markov chain graph neural networks / offline rl\r\n - Trying to estimate what data to use, active learning, actively reevaluating chain of trust as they factor into the overall decision making process (gatekeeper and prioritizer)\r\n - We will issue DIDs and store provenance as VCs\r\n - This will allow us to trace provenance\r\n - We can then simulate good data / bad data situations\r\n - We will hopefully end up with models that develop strong security posture, i.e. are risk averse and good at getting the job done\r\n- Just do the same thing with metric data instead of a meme! Duh\u2026\r\n- So for serialization we transform the UUIDs on the inputs to their DIDs, with content uploaded to the DigitalOcean space and IPFS\r\n- https://identity.foundation/keri/did_methods/\r\n- https://or13.github.io/didme.me/did-method-spec.html\r\n - Let's try to modify this to use KERI DID method spec in place of DID key method spec\r\n\r\n> ## DID Method Specification\r\n>\r\n> did:meme is a deterministic transformation of did:key, that uses IPFS, image content and bech32.\r\n>\r\n> ### DID Format\r\n>\r\n> ```\r\n> did-meme-format := did:meme:<bech32-value>\r\n> bech32-value := [a-zA-HJ-NP-Z0-9]+\r\n> ```\r\n>\r\n> The `bech32-value` is an encoded [multihash](https://multiformats.io/multihash/).\r\n>\r\n> The `multihash` is a content identifier for an image.\r\n>\r\n> The image contains a steganographically embedded `did:key`.\r\n>\r\n> See [did-key](https://w3c-ccg.github.io/did-method-key/#format).\r\n>\r\n> Another way of representing the `did:meme` identifier encoding:\r\n>\r\n> ```\r\n> did:meme:<bech32(\r\n> multihash(\r\n> stego-embed(image, did:key)\r\n> )\r\n> )>\r\n> ```\r\n>\r\n> ### DID Operations\r\n>\r\n> See [did-key](https://w3c-ccg.github.io/did-method-key/#operations).\r\n>\r\n> #### Create\r\n>\r\n> - Generate a did:key\r\n> - Steganographically embed the public key multicodec representation in a meme.\r\n> - Upload the meme to ipfs.\r\n> - Transform the CID to a did:meme with bech32.\r\n> - Update the did document to use the did:meme identifier.\r\n>\r\n> #### Read\r\n>\r\n> - Convert the bech32 id to an ipfs CID.\r\n> - Resolve the image.\r\n> - Extract the did:key multicodec.\r\n> - Construct the did:key document from the identifier.\r\n> - Update the did document to use the did:meme identifier.\r\n>\r\n> #### Update\r\n>\r\n> Not supported.\r\n>\r\n> #### Deactivate\r\n>\r\n> Not supported.\r\n>\r\n> ### Security and Privacy Considerations\r\n>\r\n> See [did-key](https://w3c-ccg.github.io/did-method-key/#security-and-privacy-considerations)\r\n>\r\n> #### Security\r\n>\r\n> Because update and deactivate are not supported, did:meme should only be used for very short lived interactions, or just lulz.\r\n>\r\n> Because did:meme identifiers are a super set of did:key, it is possible for multiple did:meme to map to the same did:key\u2026 This can be problematic when private key compromise has occured.\r\n>\r\n> Generally speaking, did:meme has similar or 
weaker security properties compared with did:key.\r\n>\r\n> #### Privacy\r\n>\r\n>Be careful to strip XIF data or other meta data from images before constructing did:meme.\r\n>\r\n> Do not use images that identify physical locations or people.\r\n\r\n- Community depth of field analysis\r\n - https://github.com/bumblefudge\r\n - Seems to be decentralized space leader\r\n - https://github.com/decentralized-identity/didcomm-messaging\r\n - https://github.com/decentralized-identity/schema-directory\r\n - https://github.com/centrehq/verite\r\n - https://github.com/learningproof/learningproof.github.io\r\n\r\n---\r\n\r\nUnsent to Hector with the city of portland\u2019s open data effort.\r\nRelated: https://docs.google.com/document/d/1Ku6y50fY-ZktcUegeCnXLsksEWbaJZddZUxa9z1ehgY/edit\r\nRelated: https://github.com/intel/dffml/issues/1293\r\n\r\nHi Hector,\r\n\r\nI wanted to circle back with you and see if there was anything you were aware of community effort wise involving city data and (de)centralized post disaster coordination efforts?\r\n\r\nThank you,\r\nJohn"
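- Sketch of the fulcio ephemeral DID idea above (export the public portion of the ephemeral key, verify doc contents offline / later): an Ed25519 `did:key` is just multibase base58btc (the leading `z`) over the `0xed01` multicodec prefix plus the raw public key. Assumes the `cryptography` and `base58` packages; the attestation payload is a stand-in.\r\n\r\n```python\r\n# Sketch: ephemeral Ed25519 did:key -> sign -> offline verification.\r\nimport base58\r\nfrom cryptography.hazmat.primitives import serialization\r\nfrom cryptography.hazmat.primitives.asymmetric import ed25519\r\n\r\nED25519_MULTICODEC = bytes([0xED, 0x01])  # multicodec ed25519-pub prefix\r\nRAW = serialization.Encoding.Raw, serialization.PublicFormat.Raw\r\n\r\n\r\ndef did_key_from_public_bytes(public: bytes) -> str:\r\n    # multibase base58btc is the leading 'z'\r\n    return 'did:key:z' + base58.b58encode(ED25519_MULTICODEC + public).decode()\r\n\r\n\r\ndef public_bytes_from_did_key(did: str) -> bytes:\r\n    decoded = base58.b58decode(did.removeprefix('did:key:z'))\r\n    assert decoded[:2] == ED25519_MULTICODEC\r\n    return decoded[2:]\r\n\r\n\r\nkey = ed25519.Ed25519PrivateKey.generate()\r\ndid = did_key_from_public_bytes(key.public_key().public_bytes(*RAW))\r\nsignature = key.sign(b'attestation payload')  # stand-in payload\r\n\r\n# Later / offline: the DID alone is enough to verify the signature.\r\nverifier = ed25519.Ed25519PublicKey.from_public_bytes(\r\n    public_bytes_from_did_key(did)\r\n)\r\nverifier.verify(signature, b'attestation payload')  # raises if tampered\r\nprint(did)  # e.g. did:key:z6Mk...\r\n```\r\n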
}
]
},
{
"body": "# 2022-10-20 Engineering Logs",
"replies": [
{
"body": "## 2022-10-20 1:1 Orie/John\r\n\r\n- There was a woman talking about AI deployment provenance as o3con\r\n- Linked data politics\r\n - Verifiable credentials\r\n - Still seeing building off ridged data formats\r\n- JSON-LD is the primary microdata format\r\n - Query engines already do this\r\n - Label property graph\r\n- Linked data integrity\r\n - JSON-LD formatted verifiable credentials\r\n- How could we do something CBOR-LD like?\r\n - Unpack into SCITT will in interesting\r\n- https://github.com/microsoft/did-x509/blob/main/specification.md\r\n- Consistent ability to restructure the envelope on (de)serialize\r\n- Ideally when RFC is published those involved driving interoperability % test suite numbers up\r\n- https://protocol.ai"
},
{
"body": "- https://json-ld.org/playground/"
}
]
},
{
"body": "# 2022-10-21 Engineering Logs",
"replies": [
{
"body": "## 2022-10-21 @pdxjohnny Engineering Logs\r\n\r\n- (De)serialization\r\n - `did:merkle:`\r\n- Online cloning cuts our iteration time\r\n - Artificial Life Is Coming Eventually\r\n - Data flows are the parallel exploration of trains of thought (nested graphs)\r\n - Natural selection and evolution\r\n - Tree of life\r\n - Parallel exploration of nested graphs\r\n - Automated synchronization of system state across distinct timelines (distinct roots)\r\n - Enables the resolution of system state post haste, post state, and post date\r\n - See fuzzy finding later in this doc: find \"join disparate roots\"\r\n - This is effectively out of order execution at a higher level of abstraction, in the aggregate, so as to bring the aggregate set of agents involved to an equilibrium state\r\n - We are building the thought communication protocol, to communicate thought is to learn\r\n - If we can describe any architecture, any problem space, we can describe any thought\r\n - To describe a thought most completely, one must know how to best communicate with that entity\r\n - That entity, that agent, is a moving target for communication at it's optimal rate of learning.\r\n - It's past is relevant in determining it's future as it's past determines what will resonate best with it in terms of forming conceptual linkages.\r\n - Past doesn't have to be memory, data and compute are the same in our architecture\r\n - Hardwired responses get encoded the same way, it's all the signal, the probability\r\n - When Alice goes through the looking glass she'll take us with her in sprit, and come back to communicate to us how best to proceed, in every way.\r\n - The less (more?) rambling way of putting this would be, we need our AI to be a true to us extension of ourselves, or of our ad-hoc formed groups, they need to be true to those strategic principles we've communicated to the machine. 
If we can trust their transparency (estimates/forecasts and provenance on that) about their ability to stay aligned to those principles, then we can accurately assess operating risk and its conformance to our threat model or any threat model the execution of the job fits within.\r\n - This means we can trust our AI to not influence us in the wrong ways.\r\n - This means we can trust it to influence us in the right ways, the ways we want to influence ourselves, or our software development lifecycle.\r\n - This assessment of the level of trust fundamentally comes from our analysis of our analysis of our software development lifecycle, our Entity Analysis Trinity.\r\n- https://github.com/OR13/did-jwk\r\n - https://github.com/OR13/did-jwk/blob/main/src/index.js#L158\r\n- https://wasmer.io/\r\n- https://oliverklingefjord.substack.com/p/pagerank-anthropology\r\n- https://github.com/decentralized-identity/universal-resolver/blob/main/docs/driver-development.md\r\n - Full demo would be `did:meme:` and [`did:jwk:`](https://twitter.com/OR13b/status/1583818675982782465) ~~and `did:keri:` hybrid~~ (will wait on `did:keri:` hybrid until after MVP) with resolver implemented which serves and fetches containers from registry, instead of JPEG, use container image format.\r\n - This demo allows us to show checks on provenance for execution\r\n - Could we also require Verifiable Credentials to resolve the DID?\r\n - We could combine with static analysis / SBOM and Open Policy Agent and threat modeling to implement AI alignment to strategic principles (as agreed in compute contract) checks.\r\n - What does this enable?\r\n - One can now reference and request fulfilment of any flow, any process, any routine, etc via a single pattern.\r\n - \ud83d\udc22\r\n - \ud83d\udc22\r\n - \ud83d\udc22\r\n- https://identity.foundation/did-registration/\r\n- Alice caught time traveling again\r\n - https://github.com/w3c-ccg/did.actor/commit/69144ab453447f682b20d8be13cd8293e888dd2f#diff-75f0c8d440957e0ea1c6945930d0ac946e85e3e324b59a8af8ed13a3918581f1R10\r\n - https://github.com/w3c-ccg/did.actor/commit/56d4f525f21b84696badc312f9654451911250f4#diff-75f0c8d440957e0ea1c6945930d0ac946e85e3e324b59a8af8ed13a3918581f1R10\r\n - https://github.com/w3c-ccg/did.actor/blob/3fe99eec616b71d7fc36c5603235eeac81c91652/bob/credentials/3732.json\r\n - https://github.com/w3c-ccg/did.actor/blob/3fe99eec616b71d7fc36c5603235eeac81c91652/alice/README.md\r\n - https://lucid.did.cards/identifiers/did:web:did.actor:alice\r\n- https://github.com/WebOfTrustInfo\r\n - https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/draft-documents/verifiable-endorsements-from-linked-claims.md\r\n - > Further, we propose to demonstrate the ability to compose several LinkedClaims into a single domain-specific credential, specifically a Verifiable Endorsement, that will satisfy the domain requirements of the likely users.\r\n >\r\n > This approach will enable rich shared datasets to inform trust decisions, while satisfying the requirements of domain-specific end users. 
If time permits a sample score can be built over the linked claim dataset.\r\n - https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/draft-documents/composable-credentials.md#standalone-claim---review\r\n - An event in our case (to start with) is data flow Input data, our cached data.\r\n - https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/draft-documents/data-exchange-agreements-with-oca.md\r\n - https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/draft-documents/data-exchange-agreements-with-oca.md#13-context-preservation---semantic-approach---the-overlays-capture-architecture-oca\r\n - Woohoo! Someone else defined overlays, now we don't have to :P\r\n - https://oca.colossi.network/\r\n - https://oca.colossi.network/guide/introduction.html#what-is-decentralised-semantics\r\n - > In the domain of decentralised semantics, task-specific objects are called \"Overlays\". They provide layers of definitional or contextual information to a stable base object called a \u201cCapture Base\u201d.\r\n- SCITT\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - https://mailarchive.ietf.org/arch/msg/scitt/NtBc7vfMm-zFKxguVfiGg-vGjHk/\r\n - VDR usage\r\n- https://github.com/WebOfTrustInfo/rwot11-the-hague/blob/master/draft-documents/did-merkle.md\r\n- Why do we like DIDs?\r\n - It is a primitive for a decentralized offline capable cryptographically secured linked list.\r\n - This allows us to join disparate roots (timelines, trees, metric data graphs) at a later time\r\n - Or to reevaluate inclusion of those sets\r\n - Or to generate new datasets entirely\r\n - Or to run inference to get those datasets / trees\r\n - Or a hybrid approach\r\n - This will enable training Alice to be risk averse, aka training to be aligned with strategic principles.\r\n - [2022-10-19 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3918361)\r\n - This will help Alice not waste time on unaligned trains of thought.\r\n - Our gatekeeper and prioritizer of course have final say, but this is to do the fuzzy filter logic on those.\r\n - https://github.com/pdxjohnny/pdxjohnny.github.io/blob/dev/content/posts/2022-03-02-did-twitter-space.md\r\n - https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/quantum-secure-dids.pdf\r\n - https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/SelfIdentifyingData.md\r\n - > The question this white-paper attempts to answer is how best to represent decentralized self-certifying self-identifying data. The main use case for this type of data are distributed (but with decentralized control) data intensive processing applications. Because data intensive applications are often limited by network and processing resources, economy of expression is an important consideration in a data representation schema. Thus there are trade-offs to be made in the design of the schema where economy of expression is a desirable feature.\r\n - > A decentralized self-identifying data item is identified by a decentralized universally unique self-certifying identifier (DID). Self certifying means that the identifier includes either a public key or a fingerprint of a public key from a cryptographic public/private key pair. The DID is included in the data item itself as the value of a field. The data item also includes a field whose value is the DID for the signer of the data item. This may or may not be the same DID used to identify the data item itself. 
Attached to the data item is a signature that is verifiable as being generated by the private key associated with the public key in the signer field's DID value. This signature verifies that the data item was created by the holder of the associated private key for the signer. The whole data item is both self-identifing and self-certifying because all identifiers are included in the signed data and are verifiable against the private keys associated with the public keys in the included DIDs.\r\n - This is exactly why we like DIDs\r\n - https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/SelfIdentifyingData.md#data-canonicalization\r\n - https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/SelfIdentifyingData.md#key-reproduction\r\n - https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/A_DID_for_everything.pdf\r\n - Good background info on DIDs\r\n - > It should be noted that a single instance of meeting is not as trustable as an entire history of meeting many people. For a state actor generating a legend for a sockpuppet, this would entail an unattainable level of work to prove personhood. For a regular human being, it's relatively efortless to use the system in an organic and unobtrusive manner. Once a root personhood verifcation could be insured, then trustable pseudonyms could be generated. Adding this verifcation to DIDs would provide trust in a trustless environment, as the DID could then provide identity and credentialing services in environments that support, or even require, pseudonymity\r\n - > Data fows can be provenanced by verifying the end-to-end integrity of data with DIDs. By enabling DIDs to sign claims about other DIDs, the fidelity of these data fows can be increased further\r\n - Bingo\r\n - > Imagine a world where this proposed technology has been deployed and globally adopted. Let us paint a picture for how this might be achieved. Imagine that this approach becomes part of a decentralized identity solution for every entity, driven by a robust and active developer community. The vision is to generate technologies that would be integrated into applications that are used in IoT, e-commerce, social interaction, banking, healthcare, and so on. Now imagine that mobile telephony companies agree to embed the technology into the operating systems for all smartphones, and the dominant social network providers agree to use DIDs and DADs and proofs about the entities controlling these DIDs and DADs in their algorithms for determining which content to propel. This would mean the end of phishing. The end of fake news. This is the beginning of new era for society, built on an interconnecting web of trust: a world in which we know what impacts we are having. The emergent property of this new data fabric is Knowing.\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0004_writing_the_wave.md\r\n - > Underlying the benefts of decentralized identity outlined above is the need for open interoperable standards to ensure the reputable provenance of the associated data fows between decentralized entities. This paper describes a novel concept for provenancing data fows using DADis (Decentralized Autonomic Data items) that are built upon the emerging DID standard. 
This approach uses and extends the advanced difuse-trust or zero-trust computing paradigm that is needed to operate securely in a world of decentralized data.\r\n - https://github.com/transmute-industries/verifiable-actions\r\n - https://github.com/transmute-industries/verifiable-data\r\n - https://github.com/transmute-industries/verifiable-data/tree/main/packages/ed25519-signature-2018\r\n - https://github.com/digitalbazaar/jsonld-signatures\r\n - > The proof purpose indicates why the proof was created and what its intended use is. This information can also be used to make sure that the verificationMethod was authorized for the stated purpose in the proof. Using a proof purpose helps to encourage people to authorize certain cryptographic keys (verification methods) for explicit purposes rather than granting them ambient authority. This approach can help prevent people from accidentally signing documents for reasons they did not intend.\r\n - https://github.com/digitalbazaar/vc-js#custom-documentloader\r\n - Data flow integration opportunities\r\n - https://github.com/WebOfTrustInfo/rwot5-boston/blob/778ccf4c56319d31ea3d9baac8a27e2cbe6763ec/topics-and-advance-readings/verifiable-claims-primer.md\r\n - https://github.com/WebOfTrustInfo/rwot5-boston/blob/master/topics-and-advance-readings/did-primer.md\r\n- https://twitter.com/vdmbrsv/status/1583512490226647040/photo/1\r\n - https://github.com/kathrinse/be_great\r\n- https://github.com/microsoft/did-x509/blob/main/specification.md\r\n- https://didcomm.org/book/v2/\r\n- Need to analyze KERI interoperability ergonomics with rest of web5 ecosystem\r\n - How would tie in with OIDC GitHub Actions / sigstore work?\r\n - Does this enable crowdsourable DB via (confidential) ledgers as root of trust watchers?\r\n - Perfect forward secrecy please with that roll forward key thing\r\n - https://github.com/WebOfTrust/keripy\r\n - Have yet to see another solution with potential DID space interop.\r\n - Have to be sure before making any next steps.\r\n - Would be very nice for datatset/cache (de)serialization.\r\n - If it can be done cleanly, might as well play with it.\r\n - Try with `did:meme`\r\n - https://or13.github.io/didme.me/did-method-spec.html\r\n - https://or13.github.io/didme.me/#using-github-pages\r\n - [2022-10-19 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3918361)\r\n - `did:oa:data:`\r\n - What used to be the meme data in the `did:meme:` is now our system context\r\n - https://github.com/w3c/did-spec-registries/compare/main...pdxjohnny:aliceoa?expand=1\r\n - `did:alice:`\r\n - Entry points for Alice the entity\r\n - https://packaging.python.org/en/latest/specifications/entry-points/\r\n - These are our `dffml.overlays.alice.please.contribute`\r\n - Upstream: `did:alice:please:contribute:<ID>`\r\n - Overlays: `did:alice:please:contribute:<ID>`\r\n - JSON-LD\r\n - Enables streaming query for applicable overlays\r\n - Decentralized Web Nodes\r\n - Enable data transfer of DID docs\r\n - For simplistic query, one can drop the `<ID>` portion of the DID\r\n - DWNs could then resolve all DIDs the operator (instantiated Operation Implementation Network) would like to make known to the requester as an advertisement of services\r\n - `did:alice:`\r\n - Resolves the base (data) flow, the upstream\r\n - Extracts the entry point from the DID doc\r\n - `did:oa:`\r\n - Ping Orie to ask for thoughts when done\r\n- How you are is how you will be\r\n- https://multiformats.io/multihash/\r\n - 
Shim-esq\r\n- https://identity.foundation/keri/did_methods/\r\n\r\n### Analysis of KERI interoperability ergonomics with rest of web5 ecosystem\r\n\r\n- References\r\n - https://github.com/WebOfTrust/keripy\r\n - https://github.com/WebOfTrust/keripy/blob/1b83ac4625b072c1f7c9f583c4dde85d5eb1cde8/setup.py#L100-L102\r\n - Notice anyone currently missing?\r\n - https://github.com/WebOfTrust/keripy/search?q=did\r\n - https://github.com/WebOfTrust/keripy/blob/303e45a1b293b544f7976fa2c56094172b3254b8/ref/Peer2PeerCredentials.md\r\n - https://github.com/WebOfTrust/keripy/blob/development/tests/peer/test_exchanging.py\r\n- https://github.com/decentralized-identity/keri/blob/master/kids/kid0009.md\r\n- https://weboftrust.github.io/did-keri/#create\r\n - https://identity.foundation/keri/docs/Glossary.html#inception-event\r\n - >![image](https://user-images.githubusercontent.com/5950433/197252695-488e3476-734d-4b3f-b551-b562674d89b2.png)\r\n >\r\n > The inception data must include the public key, the identifier derivation from that public key, and may include other configuration data. The identifier derivation may be simply represented by the derivation code. A statement that includes the inception data with attached signature made with the private key comprises a cryptographic commitment to the derivation and configuration of the identifier that may be cryptographically verified by any entity that receives it.\r\nA KERI inception statement is completely self-contained. No additional infrastructure is needed or more importantly must be trusted in order to verify the derivation and initial configuration (inception) of the identifier. The initial trust basis for the identifier is simply the signed inception statement.\r\n\r\n```console\r\n$ python -m pip install -U lmdb pysodium blake3 msgpack simplejson cbor2\r\nDefaulting to user installation because normal site-packages is not writeable\r\nCollecting lmdb\r\n Downloading lmdb-1.3.0-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (306 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 306.5/306.5 kB 11.0 MB/s eta 0:00:00\r\nCollecting pysodium\r\n Downloading pysodium-0.7.12.tar.gz (21 kB)\r\n Preparing metadata (setup.py) ... 
done\r\nCollecting blake3\r\n Downloading blake3-0.3.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.whl (1.1 MB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 1.1/1.1 MB 32.8 MB/s eta 0:00:00\r\nCollecting msgpack\r\n Downloading msgpack-1.0.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (316 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 317.0/317.0 kB 26.9 MB/s eta 0:00:00\r\nCollecting simplejson\r\n Downloading simplejson-3.17.6-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (137 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 137.1/137.1 kB 9.1 MB/s eta 0:00:00\r\nCollecting cbor2\r\n Downloading cbor2-5.4.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (224 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 224.1/224.1 kB 10.6 MB/s eta 0:00:00\r\nBuilding wheels for collected packages: pysodium\r\n Building wheel for pysodium (setup.py) ... done\r\n Created wheel for pysodium: filename=pysodium-0.7.12-py3-none-any.whl size=13458 sha256=72829531fd887689066dbfcb64fbeb37343ed194b999a944941240da3b42265e\r\n Stored in directory: /home/pdxjohnny/.cache/pip/wheels/20/c6/d1/e0ea5672f6614258bcd469d6721039778d2b8510bc420e8414\r\nSuccessfully built pysodium\r\nInstalling collected packages: pysodium, msgpack, lmdb, blake3, simplejson, cbor2\r\nSuccessfully installed blake3-0.3.1 cbor2-5.4.3 lmdb-1.3.0 msgpack-1.0.4 pysodium-0.7.12 simplejson-3.17.6\r\n$ pip install https://github.com/WebOfTrust/keripy/archive/refs/tags/v0.6.7-alpha.tar.gz#egg=keri\r\nDefaulting to user installation because normal site-packages is not writeable\r\nCollecting keri\r\n Downloading https://github.com/WebOfTrust/keripy/archive/refs/tags/v0.6.7-alpha.tar.gz\r\n / 3.1 MB 4.8 MB/s 0:00:00\r\n Preparing metadata (setup.py) ... 
done\r\nRequirement already satisfied: lmdb>=1.3.0 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from keri) (1.3.0)\r\nRequirement already satisfied: pysodium>=0.7.12 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from keri) (0.7.12)\r\nRequirement already satisfied: blake3>=0.3.1 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from keri) (0.3.1)\r\nRequirement already satisfied: msgpack>=1.0.4 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from keri) (1.0.4)\r\nRequirement already satisfied: cbor2>=5.4.3 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from keri) (5.4.3)\r\nCollecting multidict>=6.0.2\r\n Downloading multidict-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (114 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 114.5/114.5 kB 4.2 MB/s eta 0:00:00\r\nCollecting ordered-set>=4.1.0\r\n Downloading ordered_set-4.1.0-py3-none-any.whl (7.6 kB)\r\nCollecting hio>=0.6.7\r\n Downloading hio-0.6.7.tar.gz (87 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 87.7/87.7 kB 8.3 MB/s eta 0:00:00\r\n Preparing metadata (setup.py) ... done\r\nCollecting multicommand>=1.0.0\r\n Downloading multicommand-1.0.0-py3-none-any.whl (5.8 kB)\r\nCollecting jsonschema>=4.6.0\r\n Downloading jsonschema-4.16.0-py3-none-any.whl (83 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 83.1/83.1 kB 7.6 MB/s eta 0:00:00\r\nCollecting falcon>=3.1.0\r\n Downloading falcon-3.1.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (8.5 MB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 8.5/8.5 MB 52.8 MB/s eta 0:00:00\r\nCollecting daemonocle>=1.2.3\r\n Downloading daemonocle-1.2.3.tar.gz (41 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 41.4/41.4 kB 6.2 MB/s eta 0:00:00\r\n Preparing metadata (setup.py) ... 
done\r\nCollecting hjson>=3.0.2\r\n Downloading hjson-3.1.0-py3-none-any.whl (54 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 54.0/54.0 kB 4.0 MB/s eta 0:00:00\r\nRequirement already satisfied: PyYaml>=6.0 in /usr/lib64/python3.10/site-packages (from keri) (6.0)\r\nCollecting apispec>=5.2.2\r\n Downloading apispec-6.0.0-py3-none-any.whl (29 kB)\r\nCollecting mnemonic>=0.20\r\n Downloading mnemonic-0.20-py3-none-any.whl (62 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 62.0/62.0 kB 6.4 MB/s eta 0:00:00\r\nRequirement already satisfied: packaging>=21.3 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from apispec>=5.2.2->keri) (21.3)\r\nCollecting click\r\n Downloading click-8.1.3-py3-none-any.whl (96 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 96.6/96.6 kB 11.5 MB/s eta 0:00:00\r\nCollecting psutil\r\n Downloading psutil-5.9.3-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (292 kB)\r\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 292.3/292.3 kB 24.0 MB/s eta 0:00:00\r\nRequirement already satisfied: netifaces>=0.11.0 in /usr/lib64/python3.10/site-packages (from hio>=0.6.7->keri) (0.11.0)\r\nRequirement already satisfied: attrs>=17.4.0 in /usr/lib/python3.10/site-packages (from jsonschema>=4.6.0->keri) (21.4.0)\r\nRequirement already satisfied: pyrsistent!=0.17.0,!=0.17.1,!=0.17.2,>=0.14.0 in /usr/lib64/python3.10/site-packages (from jsonschema>=4.6.0->keri) (0.18.1)\r\nRequirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /home/pdxjohnny/.local/lib/python3.10/site-packages (from packaging>=21.3->apispec>=5.2.2->keri) (3.0.9)\r\nBuilding wheels for collected packages: keri, daemonocle, hio\r\n Building wheel for keri (setup.py) ... done\r\n Created wheel for keri: filename=keri-0.6.7-py3-none-any.whl size=371275 sha256=0fc4353cff6f82d93bcbe2023b5fbe34d8f19695b534280b39d6501e34fec6c4\r\n Stored in directory: /home/pdxjohnny/.cache/pip/wheels/5d/d4/7a/c5394220af3d084c08af13cdfc6c822adade30f969caa3e6be\r\n Building wheel for daemonocle (setup.py) ... done\r\n Created wheel for daemonocle: filename=daemonocle-1.2.3-py3-none-any.whl size=27547 sha256=245fcb13356d1abfade022d8ec1d71df72f6a75613e3a3a021f18c47a18a1895\r\n Stored in directory: /home/pdxjohnny/.cache/pip/wheels/90/74/0a/e42fc6338ed1604a4b23fb4ebd4c1c7c7ae716f0ecbbe6fb14\r\n Building wheel for hio (setup.py) ... 
done\r\n Created wheel for hio: filename=hio-0.6.7-py3-none-any.whl size=97821 sha256=c8ab55b918d13057109de99a475c729fd6b8ef9cc249e01a933ca88156cd357f\r\n Stored in directory: /home/pdxjohnny/.cache/pip/wheels/9f/a0/f7/8696eba689852f5f33237d5e67a5f71a6b084e3df25dc7080d\r\nSuccessfully built keri daemonocle hio\r\nInstalling collected packages: hjson, psutil, ordered-set, multidict, multicommand, mnemonic, jsonschema, falcon, click, hio, daemonocle, apispec, keri\r\nSuccessfully installed apispec-6.0.0 click-8.1.3 daemonocle-1.2.3 falcon-3.1.0 hio-0.6.7 hjson-3.1.0 jsonschema-4.16.0 keri-0.6.7 mnemonic-0.20 multicommand-1.0.0 multidict-6.0.2 ordered-set-4.1.0 psutil-5.9.3\r\n```\r\n\r\n- References\r\n - https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/components/DIDMemeCreator.tsx#L59\r\n - https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/core/DIDMeme/index.ts\r\n - https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/core/ipfs.ts\r\n - https://github.com/desudesutalk/f5stegojs#cli-tool\r\n - https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/components/DIDMemeCreator.tsx#L42\r\n - https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/components/DIDMemeCreator.tsx#L157\r\n - https://github.com/OR13/didme.me/blob/14da8e47d8a1a4bef3cc1c85968c9f8b6963d269/components/WalletCreator.tsx#L20-L70\r\n- TODO\r\n - [ ] Read https://github.com/SmithSamuelM/Papers/blob/master/whitepapers/alice-attempts-abuse-verifiable-credential.pdf\r\n - [ ] 2nd party infra\r\n - [ ] Stream of consciousness\r\n - [ ] GitHub Actions webhook: enable Stream of Consciousness in repo settings, then dispatch workflows via stream of consciousness path logic, reading trigger filtering based on `on.push.paths`\r\n - [ ] Could use DID entry points as paths to signal that a workflow should be triggered on that event\r\n - Could get down to operation granularity referenced inside flows for given event streams.\r\n - Example: `paths: [\"did:alice:shouldi:contribute:clone_git_repo:outputs.repo\"]`\r\n - Through workflow inspection we can expose this as an overlay\r\n - It can be advertised to the stream of consciousness that this workflow should be dispatched, if the overlay is enabled
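\r\n\r\nBelow is a minimal sketch of producing the signed, self-contained inception statement described in the quoted spec text above, assuming the keripy v0.6.7 API as exercised in its tests (`coring.Signer`, `eventing.incept`); unverified, illustration only:\r\n\r\n```python\r\nfrom keri.core import coring, eventing\r\n\r\n# New Ed25519 keypair; the identifier prefix is derived from the public key\r\nsigner = coring.Signer(transferable=True)\r\nserder = eventing.incept(keys=[signer.verfer.qb64])\r\n# Commitment by the controlling private key over the serialized event\r\nsiger = signer.sign(ser=serder.raw, index=0)\r\nprint(serder.raw.decode())  # the self-contained inception statement\r\nprint(siger.qb64)  # attached signature, verifiable against the included key\r\n```"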
}
]
},
{
"body": "# 2022-10-22 Engineering Logs",
"replies": [
{
"body": "- Developer yellow brick road to critical velocity\r\n - search engineering logs for other refs\r\n- Use automl PRs from Edison to issue cobteacts for evaluation of hyperparamets as dataflow / operation / manifest instance (DID based encoded). automl then auto feature engineerinh "
}
]
},
{
"body": "# 2022-10-23 Engineering Logs",
"replies": [
{
"body": "## 2022-10-23 @pdxjohnny Engineering Logs\r\n\r\n- https://github.com/transmute-industries/did-jwk-pqc\r\n - Orie coincidentally posted he\u2019s working on didme.me v2 which will use post quantum json web keys.\r\n - John to pursue container image registery side of previous idea."
}
]
},
{
"body": "# 2022-10-24 Engineering Logs",
"replies": [
{
"body": "# Rolling Alice: Architecting Alice: An Image\r\n\r\n> Moved to: https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0007_an_image.md\r\n\r\n- In relation to the manifest encoded as a \"screenshot as universal API\"\r\n - https://twitter.com/mattrickard/status/1577321709350268928\r\n - https://twitter.com/David3141593/status/1584462389977939968\r\n - > TIL python's pip will execute a setup .py directly from a ZIP archive from a web URL, with mime sniffing. This allows for a nice lolbin oneliner, with payload hosted on Twitter's CDN (or anywhere else really) `$ pip install \"https://pbs\".\"twimg\".\"com/media/Ff0iwcvXEAAQDZ3.png\"` (or $ pip install https://t\".\"co/uPXauf8eTg`)\r\n > ![image](https://user-images.githubusercontent.com/5950433/197549602-f1f98e38-5f34-4d04-b64c-94d49264d189.png)\r\n > ![source_code zip](https://user-images.githubusercontent.com/5950433/197549941-b915f643-4c29-4442-bf88-2a1ad604e877.png)\r\n - Sounds like we finally have ourselves a reliable distribution mechanism! :)\r\n - need parity with text as universal API\r\n - screenshots as operations\r\n - YAML for dataflow\r\n - encourages short functions :P\r\n - Everything effectively a manifest instance, operation plus metadata\r\n - https://satori-syntax-highlighter.vercel.app/\r\n - https://twitter.com/shuding_/status/1581358324569645056\r\n - https://satori-syntax-highlighter.vercel.app/api/highlighter?code=let%20alice%20%3D%20new%20Alice()&background=%23E36FB7&lang=js&fontSize=16\r\n - https://pypi.org/project/svglib/\r\n - https://github.com/deeplook/svglib/blob/9472e067d88920debfbf6daefed32045025bf039/scripts/svg2pdf#L36-L45\r\n - https://github.com/deeplook/svglib/blob/9472e067d88920debfbf6daefed32045025bf039/svglib/svglib.py#L1402-L1414\r\n - https://github.com/deeplook/svglib/blob/9472e067d88920debfbf6daefed32045025bf039/svglib/svglib.py#L1438-L1447\r\n - It's just a screenshot of code\r\n - You just take a bunch of screenshots and put them together and that's your overlays\r\n - You can always trampoline and use one as a manifest or wrapper to resolution via a next phase storage medium.\r\n - didme.mev2\r\n - https://github.com/transmute-industries/did-jwk-pqc\r\n- https://twitter.com/amasad/status/1584327997695283200/photo/1\r\n- We'll proxy the registry off all these images\r\n\r\n```console\r\n$ curl -sfL \"https://satori-syntax-highlighter.vercel.app/api/highlighter?code=let%20alice%20%3D%20new%20Alice()&background=%23E36FB7&lang=js&fontSize=16\" | \r\n```\r\n\r\n- Future\r\n - Streaming? Solved! Video streaming APIs :P\r\n - Generate an image of Alice with all her source code packaged\r\n - pip install of image\r\n - Eventually generate videos\r\n - Container registry service endpoint can build container images or manifest images / instances"
},
{
"body": "## 2022-10-24 @pdxjohnny Engineering Logs\r\n\r\n- https://medium.com/mlearning-ai/enter-the-world-of-diffusion-models-4485fb5c5986\r\n- https://github.com/martinthomson/i-d-template\r\n- https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0007_an_image.md\r\n - Future\r\n - Lossy encoded software DNA transmitted via ad-hoc formed webrtc channels with data / component provenance encoded in-band (maybe SCITT receipts). Context aware collective intelligence is then enabled to iterate at high speed within conceptual impact bounds per group agreed policy.\r\n - Or multicast ;P\r\n - ![spaceballs-ludicous-speed](https://user-images.githubusercontent.com/5950433/197626110-69a6f9a3-9e2c-45fa-8ecc-784232c8e868.gif)\r\n- https://twitter.com/pdxjohnny/status/1584657901414928385\r\n - https://asciinema.org/a/531762\r\n\r\n[![asciicast](https://asciinema.org/a/531762.svg)](https://asciinema.org/a/531762)\r\n\r\n- https://www.nps.gov/neri/planyourvisit/the-legend-of-john-henry-talcott-wv.htm\r\n - \"If I can't beat this steam drill down, I'll die with this hammer in my hand!\" [John Henry]\r\n\r\n### Rolling Alice: Architecting Alice: An Image\r\n\r\n- References\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0007_an_image.md\r\n - https://github.com/CleasbyCode/pdvzip\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n - https://satori-syntax-highlighter.vercel.app/api/highlighter?fontSize=4&lang=python&background=%23E36FB7&code=%22%22%22%0AUsage%0A%2A%2A%2A%2A%2A%0A%0A%2A%2ATODO%2A%2A%0A%0A-%20Packaging%0A%0A..%20code-block%3A%3A%20console%0A%0A%20%20%20%20%24%20echo%20Package%20python%20into%20wheel%20given%20entry%20points%20to%20overlay%20dffml.overlays.alice.please.contribute.recommended_community_standards%0A%20%20%20%20%24%20echo%20Embed%20JWK%0A%20%20%20%20%24%20echo%20JWK%20fulcio%20OIDC%3F%0A%20%20%20%20%24%20echo%20upload%20to%20twitter%20or%20somewhere%0A%20%20%20%20%24%20echo%20download%20and%20verify%20using%20JWK%2C%20show%20OIDC%20for%20online%20lookup%0A%20%20%20%20%24%20pip%20install%20package.zip%0A%20%20%20%20%24%20alice%20shouldi%20contribute%20-log%20debug%20-keys%20https%3A%2F%2Fexamples.com%2Frepowith%2Fmyconfigjson%0A%0A%22%22%22%0Aimport%20json%0Aimport%20pathlib%0Afrom%20typing%20import%20NewType%0A%0AMyConfig%20%3D%20NewType%28%22MyConfig%22%2C%20object%29%0AMyConfigUnvalidated%20%3D%20NewType%28%22MyConfigUnvalidated%22%2C%20object%29%0AMyConfigProjectName%20%3D%20NewType%28%22MyConfigProjectName%22%2C%20str%29%0AMyConfigDirectory%20%3D%20NewType%28%22MyConfigDirectory%22%2C%20str%29%0A%0A%0Adef%20read_my_config_from_directory_if_exists%28%0A%20%20%20%20directory%3A%20MyConfigDirectory%2C%0A%29%20-%3E%20MyConfigUnvalidated%3A%0A%20%20%20%20%22%22%22%0A%20%20%20%20%3E%3E%3E%20import%20json%0A%20%20%20%20%3E%3E%3E%20import%20pathlib%0A%20%20%20%20%3E%3E%3E%20import%20tempfile%0A%20%20%20%20%3E%3E%3E%0A%20%20%20%20%3E%3E%3E%20with%20tempfile.TemporaryDirectory%28%29%20as%20tempdir%3A%0A%20%20%20%20...%20%20%20%20%20_%20%3D%20pathlib.Path%28tempdir%2C%20%22.myconfig.json%22%29.write_text%28json.dumps%28%7B%22name%22%3A%20%22Hello%20World%22%7D%29%29%0A%20%20%20%20...%20%20%20%20%20print%28read_my_config_from_directory_if_exists%28tempdir%29%29%0A%20%20%20%20%7B%27name%27%3A%20%27Hello%20World%27%7D%0A%20%20%20%20%22%22%22%0A%20%20%20%20path%20%3D%20pathlib.Path%28directory%2C%20%22.myconfig.json%22%29%0A%20%20%20%20if%20not%20path.exists%28%2
9%3A%0A%20%20%20%20%20%20%20%20return%0A%20%20%20%20return%20json.loads%28path.read_text%28%29%29%0A%0A%0Adef%20validate_my_config%28%0A%20%20%20%20config%3A%20MyConfigUnvalidated%2C%0A%29%20-%3E%20MyConfig%3A%0A%20%20%20%20%23%20TODO%28security%29%20json%20schema%20valiation%20of%20myconfig%20%28or%0A%20%20%20%20%23%20make%20done%20automatically%20by%20operation%20manifest%20schema%0A%20%20%20%20%23%20validation%20on%20InputNetwork%2C%20maybe%2C%20just%20one%20option%2C%0A%20%20%20%20%23%20or%20maybe%20similar%20to%20how%20prioritizer%20gets%20applied%2C%0A%20%20%20%20%23%20or%20maybe%20this%20is%20an%20issue%20we%20already%20track%3A%20%231400%29%0A%20%20%20%20return%20config%0A%0A%0Adef%20my_config_project_name%28%0A%20%20%20%20config%3A%20MyConfig%2C%0A%29%20-%3E%20MyConfigProjectName%3A%0A%20%20%20%20%22%22%22%0A%20%20%20%20%3E%3E%3E%20print%28my_config_project_name%28%7B%22name%22%3A%20%22Hello%20World%22%7D%29%29%0A%20%20%20%20Hello%20World%0A%20%20%20%20%22%22%22%0A%20%20%20%20return%20config%5B%22name%22%5D%0A\r\n - `$ python -c 'import sys, urllib.parse; sys.stdout.write(urllib.parse.quote(sys.stdin.read(), safe=\"\"))'`\r\n - Orie mentioned \"Only twitter web client works for PNGs and they have to be under 900 pixels.\"\r\n - https://twitter.com/OR13b/status/1584669807827648512?s=20&t=Xec9v05emwSphzT6W0R8PA\r\n - https://github.com/ossf/scorecard/blob/main/options/flags.go\r\n\r\n```console\r\n$ git clone https://github.com/CleasbyCode/pdvzip\r\n$ cd pdvzip/ && g++ pdvzip.cpp -o pdvzip\r\n$ dffml service dev create blank alice-shouldi-contribute-openssf-scorecard\r\n$ cd alice-shouldi-contribute-openssf-scorecard\r\n$ sed -i 's/zip_safe = False/zip_safe = True/' setup.cfg\r\n$ sed -i 's/# entry_points/entry_points/' setup.cfg\r\n$ echo -e '[dffml.overlays.alice.shouldi.contribute]\\nOpenSSFScorecard = alice_shouldi_contribute_openssf_scorecard.operations' | tee entry_points.txt\r\n```\r\n\r\n**alice_shouldi_contribute_openssf_scorecard/operations.py**\r\n\r\n```python\r\n\"\"\"\r\nUsage\r\n*****\r\n\r\n**TODO**\r\n\r\n- Packaging\r\n\r\n.. code-block:: console
\r\n\r\n    $ echo Package python into wheel given entry points to overlay dffml.overlays.alice.please.contribute.recommended_community_standards\r\n    $ echo Embed JWK\r\n    $ echo JWK fulcio OIDC?\r\n    $ echo upload to twitter or somewhere\r\n    $ echo download and verify using JWK, show OIDC for online lookup\r\n    $ pip install package.zip\r\n    $ alice shouldi contribute -log debug -keys https://examples.com/repowith/myconfigjson\r\n\r\n\"\"\"\r\nimport os\r\nimport json\r\nimport pathlib\r\nimport platform\r\nimport contextlib\r\nfrom typing import Dict, NewType\r\n\r\nimport dffml\r\nimport dffml_feature_git.feature.definitions\r\n\r\n\r\n@dffml.config\r\nclass EnsureScorecardConfig:\r\n    cache_dir: pathlib.Path = dffml.field(\r\n        \"Cache directory to store downloads in\",\r\n        default_factory=lambda: pathlib.Path(os.getcwd()),\r\n    )\r\n    platform_urls: Dict[str, Dict[str, str]] = dffml.field(\r\n        \"Mapping of platform.system() return values to scorecard download URLs with hashes\",\r\n        default_factory=lambda: {\r\n            \"Linux\": {\r\n                \"url\": \"https://github.com/ossf/scorecard/releases/download/v4.8.0/scorecard_4.8.0_linux_amd64.tar.gz\",\r\n                \"expected_hash\": \"8e90236b3e863447fc98f6131118cd1f509942f985f30ba02825c5d67f2b9999f0ac5aa595bb737ef971788c48cd20c9\",\r\n            },\r\n        },\r\n    )\r\n\r\n\r\nOpenSSFScorecardBinaryPath = NewType(\"OpenSSFScorecardBinaryPath\", str)\r\n\r\n\r\n@dffml.op(\r\n    config_cls=EnsureScorecardConfig, imp_enter={\"stack\": contextlib.AsyncExitStack,},\r\n)\r\nasync def ensure_scorecard(self) -> OpenSSFScorecardBinaryPath:\r\n    scorecard = await dffml.cached_download_unpack_archive(\r\n        **{\r\n            \"file_path\": self.parent.config.cache_dir.joinpath(\"scorecard.tar.gz\"),\r\n            \"directory_path\": self.parent.config.cache_dir.joinpath(\"scorecard-download\"),\r\n            # Use whatever values are appropriate for the system we are on\r\n            **self.parent.config.platform_urls[platform.system()],\r\n        }\r\n    )\r\n    self.parent.stack.enter_context(dffml.prepend_to_path(scorecard))\r\n    binary_path = list(scorecard.glob(\"scorecard*\"))[0].resolve()\r\n    return binary_path\r\n\r\n\r\n# TODO https://koxudaxi.github.io/datamodel-code-generator/ from schema\r\nOpenSSFScorecardResults = NewType(\"OpenSSFScorecardResults\", dict)\r\n\r\n\r\n@dffml.op\r\nasync def openssf_scorecard(\r\n    self,\r\n    scorecard_path: OpenSSFScorecardBinaryPath,\r\n    repo: dffml_feature_git.feature.definitions.git_repository,\r\n) -> OpenSSFScorecardResults:\r\n    cmd = [\r\n        scorecard_path,\r\n        \"--format=json\",\r\n        f\"--local={repo.directory}\"\r\n    ]\r\n    async for event, result in dffml.run_command_events(\r\n        cmd,\r\n        cwd=repo.directory,\r\n        env={\r\n            **os.environ,\r\n        },\r\n        events=[dffml.Subprocess.STDOUT],\r\n        logger=self.logger,\r\n    ):\r\n        return json.loads(result.decode())\r\n\r\n```\r\n\r\n```console\r\n$ pip install -e .\r\n$ dffml service dev entrypoints list dffml.overlays.alice.shouldi.contribute\r\nOpenSSFScorecard = alice_shouldi_contribute_openssf_scorecard.operations -> alice-shouldi-contribute-openssf-scorecard 0.1.dev1+g614cd2a.d20221025 (/home/coder/.local/lib/python3.9/site-packages)\r\n$ alice -log debug shouldi contribute -keys https://${GH_ACCESS_TOKEN}@github.com/pdxjohnny/httptest\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:Instantiating operation implementation alice_shouldi_contribute_openssf_scorecard.operations:ensure_scorecard(alice_shouldi_contribute_openssf_scorecard.operations:ensure_scorecard) with default config: 
EnsureScorecardConfig(cache_dir=PosixPath('/tmp/tmp.hgZT8hhxqR/didme.me/pdvzip/alice-shouldi-contribute-openssf-scorecard'), platform_urls={'Linux': {'url': 'https://github.com/ossf/scorecard/releases/download/v4.8.0/scorecard_4.8.0_linux_amd64.tar.gz', 'expected_hash': '8e90236b3e863447fc98f6131118cd1f509942f985f30ba02825c5d67f2b9999f0ac5aa595bb737ef971788c48cd20c9'}})\r\nDEBUG:dffml.AliceShouldiContributeOpenssfScorecardOperations:EnsureScorecardImplementation:EnsureScorecardConfig(cache_dir=PosixPath('/tmp/tmp.hgZT8hhxqR/didme.me/pdvzip/alice-shouldi-contribute-openssf-scorecard'), platform_urls={'Linux': {'url': 'https://github.com/ossf/scorecard/releases/download/v4.8.0/scorecard_4.8.0_linux_amd64.tar.gz', 'expected_hash': '8e90236b3e863447fc98f6131118cd1f509942f985f30ba02825c5d67f2b9999f0ac5aa595bb737ef971788c48cd20c9'}})\r\n```\r\n\r\n- It's running the `ensure_scorecard` but not the scan.\r\n\r\n```console\r\n$ dffml service dev export alice.cli:ALICE_COLLECTOR_DATAFLOW | tee alice_shouldi_contribute.json\r\n$ dffml dataflow diagram alice_shouldi_contribute.json | tee alice_shouldi_contribute.mmd\r\n```\r\n\r\n- Found that we are using `dffml_feature_git.feature.definitions`\r\n - Rather than we had first tried `AliceGitRepo`, we need to update the shouldi code to have Alice specifics.\r\n\r\n\r\n```console\r\n$ alice -log debug shouldi contribute -keys https://${GH_ACCESS_TOKEN}@github.com/pdxjohnny/httptest\r\nTraceback (most recent call last):\r\n File \"/src/dffml/dffml/df/memory.py\", line 1291, in run_dispatch\r\n outputs = await self.run(\r\n File \"/src/dffml/dffml/df/memory.py\", line 1256, in run\r\n return await self.run_no_retry(ctx, octx, operation, inputs)\r\n File \"/src/dffml/dffml/df/memory.py\", line 1233, in run_no_retry\r\n outputs = await opctx.run(inputs)\r\n File \"/src/dffml/dffml/df/base.py\", line 547, in run\r\n result = await result\r\n File \"/tmp/tmp.hgZT8hhxqR/didme.me/pdvzip/alice-shouldi-contribute-openssf-scorecard/alice_shouldi_contribute_openssf_scorecard/operations.py\", line 64, in openssf_scorecard\r\n async for event, result in dffml.run_command_events(\r\n File \"/src/dffml/dffml/util/subprocess.py\", line 83, in run_command_events\r\n raise RuntimeError(\r\nRuntimeError: [PosixPath('/tmp/tmp.hgZT8hhxqR/didme.me/pdvzip/alice-shouldi-contribute-openssf-scorecard/scorecard-download/scorecard-linux-amd64'), '--format=json', '--local=/tmp/dffml-feature-git-ly4u_eds']: Error: check runtime error: Dependency-Update-Tool: internal error: Search: unsupported feature\r\n{\"date\":\"2022-10-25\",\"repo\":{\"name\":\"file:///tmp/dffml-feature-git-ly4u_eds\",\"commit\":\"unknown\"},\"scorecard\":{\"version\":\"v4.8.0\",\"commit\":\"c40859202d739b31fd060ac5b30d17326cd74275\"},\"score\":6.8,\"checks\":[{\"details\":null,\"score\":10,\"reason\":\"no dangerous workflow patterns detected\",\"name\":\"Dangerous-Workflow\",\"documentation\":{\"url\":\"https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#dangerous-workflow\",\"short\":\"Determines if the project's GitHub Action workflows avoid dangerous patterns.\"}},{\"details\":null,\"score\":-1,\"reason\":\"internal error: Search: unsupported feature\",\"name\":\"Dependency-Update-Tool\",\"documentation\":{\"url\":\"https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#dependency-update-tool\",\"short\":\"Determines if the project uses a dependency update tool.\"}},{\"details\":null,\"score\":10,\"reason\":\"license file 
detected\",\"name\":\"License\",\"documentation\":{\"url\":\"https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#license\",\"short\":\"Determines if the project has defined a license.\"}},{\"details\":null,\"score\":9,\"reason\":\"dependency not pinned by hash detected -- score normalized to 9\",\"name\":\"Pinned-Dependencies\",\"documentation\":{\"url\":\"https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#pinned-dependencies\",\"short\":\"Determines if the project has declared and pinned the dependencies of its build process.\"}},{\"details\":null,\"score\":0,\"reason\":\"non read-only tokens detected in GitHub workflows\",\"name\":\"Token-Permissions\",\"documentation\":{\"url\":\"https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#token-permissions\",\"short\":\"Determines if the project's workflows follow the principle of least privilege.\"}}],\"metadata\":null}\r\n2022/10/25 00:30:47 error during command execution: check runtime error: Dependency-Update-Tool: internal error: Search: unsupported feature\r\n\r\n\r\nThe above exception was the direct cause of the following exception:\r\n\r\nTraceback (most recent call last):\r\n File \"/home/coder/.local/bin/alice\", line 8, in <module>\r\n sys.exit(AliceCLI.main())\r\n File \"/src/dffml/dffml/util/cli/cmd.py\", line 286, in main\r\n result = loop.run_until_complete(cls._main(*argv[1:]))\r\n File \"/.pyenv/versions/3.9.13/lib/python3.9/asyncio/base_events.py\", line 647, in run_until_complete\r\n return future.result()\r\n File \"/src/dffml/dffml/util/cli/cmd.py\", line 252, in _main\r\n return await cls.cli(*args)\r\n File \"/src/dffml/dffml/util/cli/cmd.py\", line 238, in cli\r\n return await cmd.do_run()\r\n File \"/src/dffml/dffml/util/cli/cmd.py\", line 215, in do_run\r\n return [res async for res in self.run()]\r\n File \"/src/dffml/dffml/util/cli/cmd.py\", line 215, in <listcomp>\r\n return [res async for res in self.run()]\r\n File \"/src/dffml/dffml/cli/dataflow.py\", line 287, in run\r\n async for record in self.run_dataflow(\r\n File \"/src/dffml/dffml/cli/dataflow.py\", line 272, in run_dataflow\r\n async for ctx, results in octx.run(\r\n File \"/src/dffml/dffml/df/memory.py\", line 1713, in run\r\n raise exception\r\n File \"/src/dffml/dffml/df/memory.py\", line 1881, in run_operations_for_ctx\r\n raise OperationException(\r\ndffml.df.base.OperationException: alice_shouldi_contribute_openssf_scorecard.operations:openssf_scorecard({'scorecard_path': OpenSSFScorecardBinaryPath, 'repo': git_repository}): {'scorecard_path': PosixPath('/tmp/tmp.hgZT8hhxqR/didme.me/pdvzip/alice-shouldi-contribute-openssf-scorecard/scorecard-download/scorecard-linux-amd64'), 'repo': GitRepoSpec(directory='/tmp/dffml-feature-git-ly4u_eds', URL='https://@github.com/pdxjohnny/httptest')}\r\n$ python -c 'import yaml, json,sys; print(yaml.dump(json.loads(sys.stdin.read())))' < error.json\r\n```\r\n\r\n```yaml\r\nchecks:\r\n- details: null\r\n documentation:\r\n short: Determines if the project's GitHub Action workflows avoid dangerous patterns.\r\n url: https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#dangerous-workflow\r\n name: Dangerous-Workflow\r\n reason: no dangerous workflow patterns detected\r\n score: 10\r\n- details: null\r\n documentation:\r\n short: Determines if the project uses a dependency update tool.\r\n url: 
https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#dependency-update-tool\r\n  name: Dependency-Update-Tool\r\n  reason: 'internal error: Search: unsupported feature'\r\n  score: -1\r\n- details: null\r\n  documentation:\r\n    short: Determines if the project has defined a license.\r\n    url: https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#license\r\n  name: License\r\n  reason: license file detected\r\n  score: 10\r\n- details: null\r\n  documentation:\r\n    short: Determines if the project has declared and pinned the dependencies of its\r\n      build process.\r\n    url: https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#pinned-dependencies\r\n  name: Pinned-Dependencies\r\n  reason: dependency not pinned by hash detected -- score normalized to 9\r\n  score: 9\r\n- details: null\r\n  documentation:\r\n    short: Determines if the project's workflows follow the principle of least privilege.\r\n    url: https://github.com/ossf/scorecard/blob/c40859202d739b31fd060ac5b30d17326cd74275/docs/checks.md#token-permissions\r\n  name: Token-Permissions\r\n  reason: non read-only tokens detected in GitHub workflows\r\n  score: 0\r\ndate: '2022-10-25'\r\nmetadata: null\r\nrepo:\r\n  commit: unknown\r\n  name: file:///tmp/dffml-feature-git-ly4u_eds\r\nscore: 6.8\r\nscorecard:\r\n  commit: c40859202d739b31fd060ac5b30d17326cd74275\r\n  version: v4.8.0\r\n```\r\n\r\n- TODO\r\n - [ ] Portrait screenshots?\r\n - [ ] Split into two screenshots, one upstream, one overlay\r\n - [ ] Another screenshot serving as their manifest to do both
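\r\n\r\nThe `Dependency-Update-Tool` failure above suggests that check needs the GitHub API, which a `--local` scan cannot answer. One possible workaround, sketched against the `openssf_scorecard` operation from earlier (scorecard's `--checks` flag is real, but this particular check list is an untested assumption):\r\n\r\n```python\r\n    cmd = [\r\n        scorecard_path,\r\n        \"--format=json\",\r\n        # Hypothetical workaround: only run checks that work on local checkouts\r\n        \"--checks=Dangerous-Workflow,License,Pinned-Dependencies,Token-Permissions\",\r\n        f\"--local={repo.directory}\",\r\n    ]\r\n```"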
}
]
},
{
"body": "# 2022-10-25 Engineering Logs\r\n\r\n- [ ] Cleanup progress report transcripts and post within Architecting Alice as numbered files 0000_\r\n- [ ] GitHub Container Registry or Digital Ocean space or something as registry with static content?\r\n - https://github.com/MrE-Fog/static-container-registry\r\n- [ ] Stream of Consciousness to trigger downstream rebuilds\r\n - https://github.com/intel/dffml/pull/1420\r\n - Ensure we show at least one downstream rebuild\r\n - `dffml`\r\n - `dffml[all]`\r\n - Future\r\n - Enable downstream events for builds of different tags / layers\r\n within existing dockerfiles and push them (if intermediate rebuilt).\r\n- [ ] Fix DFFML CI\r\n - https://github.com/intel/dffml/actions/runs/3318045403\r\n - Not looking good...\r\n - https://github.com/intel/dffml/pull/1420\r\n- [ ] Fix Alice CI\r\n- [ ] 2ndparty\r\n- [ ] RFCv2\r\n- [ ] Call for contribution again\r\n- [ ] Alice on chain\r\n - [ ] https://github.com/intel/dffml/discussions/1369#discussioncomment-2683370\r\n - [ ] Distributed system context store: web3 + manifests\r\n - [ ] Wonderland: The nickname we give the collective mass of thoughts in existence. This all the data in Alice on chain.\r\n - [ ] https://github.com/intel/dffml/issues/1377\r\n- [x] Dataflow as class\r\n- [ ] add the dataflow we executed to the chain. The next execution it should load data from some location via overlay to add this top level system context to the hostory of executed contexts. And the top level context should be linked both ways to the orignal external inputs (UCAN?)\r\n- [ ] Cached flows to did chain then to backing storage via default input network as dataflow that does this to did in background. Start with json so they get saved to file. Add identity as input to top level context. Identiy could have parent input objects. such as this is of definition github username, which you could then have an operation that takes github usernames and outputs their SPDXIDs. When that operation SPDXID output is run through the deafult DID input network, a strategic plan (default overlayed dataflow to the default input network) which does this forking stuff. Could have location for user overlays in .local or something. When a context is thought of or hypothesised or executed it will be in the user context herstory. Users can optionally add overlays to their default flows (kind of like systemd). This could enable a user to overlay if im worjing within this cwd for this top level system cobtext run these commands. Alice as shell\r\n - [ ] long term: fork to save to chain on process exit (can we fork or coredump somehow on atexit?) by default.\r\n- [ ] cve bin tool checker from chain\r\n- [ ] https://gitbom.dev/\r\n- [ ] Fix TODO on watching new contexts in memory orchestrator OR maybe this is fixed via the seperate linage? Probably needs event filtration similar to run_command so by default if not set in kwargs only \r\n- [ ] Operations and their config as inputs\r\n - [ ] Unify typing via parent type / primitive as Input parents\r\n - [ ] Can have operations that filter and old let through Input objects with specific parents or parents in specific order\r\n - [ ] The config dataflow, the startup on is the same as this new instantiate operations from Input objects. We can add shared config becomes a bunch of input objects. We have something like flow. \u2018config_flow\u2019 maybe which is where we\u2019ll do initialization. Actually, lets just re use the main execution. 
\r\n - [ ] Locality\r\n - [ ] Operation name\r\n - [ ] Stub values added as parents to outputs. Structured logs from an operation added as parents to operation outputs\r\n- [ ] Use newfound operations and inputs with stub values\r\n- [ ] Run an overlayed flow with output operations to build C4 models of our dataflow based on parent input analysis. Generate architecture diagrams from it.\r\n- [ ] Unify type system with Python\u2019s type system via newfound input parent chains (#188)\r\n- [ ] prioritizer\r\n - [ ] strategic plans (similar to dataflow as class method output grabbers)\r\n - [ ] gatekeeper\r\n- [ ] Inventory\r\n- [ ] Creation based on datatypes\r\n - [ ] Input to dataclass field mappings\r\n - [ ] Quicker syntax for dataflow definition\r\n- [ ] Have strategic plan models predict what inputs and outputs will exist to reach desired output metrics\r\n - [ ] Alice create threat model of code base\r\n - [ ] strategic plan for threat model completeness\r\n - [ ] keeps suggesting new system contexts, or incentivizing creation of new system contexts by other strategic plans so as to drive up completeness metric\r\n - [ ] New contexts are created by finding different sets of operations connected differently via flow modifications where applicable\r\n - [ ] These new contexts are run through a validity check to ensure all inputs to operations are consumed and all outputs are consumed by strategic plans somewhere.\r\n - [ ] Provide functionality to audit unused output values.\r\n - [ ] Gatekeeper and prioritizer models help decide what gets run and when.\r\n - [ ] top level system context we are executing in takes an input completeness for an organizationally applied strategic plan. Likely this completeness is a situation where we have a property of an `@config` which maps to a definition with something to do with completeness.\r\n - [ ] Target example around DFFML itself and its development, and other OSS libs\r\n\r\n---\r\n\r\nsystem context includes\r\n\r\n- I/O\r\n - Any cached values\r\n- Prioritizer\r\n - Strategic plans\r\n - Some agents will not work with you unless they can run a strategic plan across a system context they are given to execute, to ensure that the system context has active provenance information that tells them to their desired level of assurance (trusted party vouch, attestation as an option)\r\n - We need to log which plans we execute as a part of the prioritizer using structured metrics or as an output of some kind\r\n - Gatekeeper\r\n- Dataflow\r\n\r\n---\r\n\r\n### Note\r\n\r\n- If you don't make a threat model, your attacker will make it for you. Daisy chains: she thinks about making one, but then the rabbit is more interesting and now we're down the hole. Oops, too late, should have made the threat model first. Let's hurry up and make it quickly before we get too deep into Wonderland.\r\n- shouldi, wonder about installing packages. Explain how that increases threat surface.
\r\n- Write about how we extended shouldi and go into technical details.\r\n- Building markdown docs with mermaid diagrams\r\n\r\n---\r\n\r\n## Living THREATS.md\r\n\r\nInstall Alice: https://github.com/intel/dffml/tree/alice/entities/alice\r\n\r\nCreate the `THREATS.md` file\r\n\r\n```console\r\n$ alice threats \\\r\n    -inputs \\\r\n      models/good.json=ThreatDragonThreatModelPath \\\r\n      models/GOOD_THREATS.md=ThreatsMdPath\r\n```\r\n\r\nWe made `auditor_overlay.py`, a data flow that calls the auditor. We use `sed` to rename the flow's input definition (`auditor_overlay:audit.inputs.ltm`) to `ThreatDragonThreatModelPath`, so the overlay runs on the path to the Threat Dragon threat model given as input.\r\n\r\n```console\r\n$ dffml service dev export auditor_overlay:AUDITOR_OVERLAY \\\r\n    -configloader yaml \\\r\n    | sed -e 's/auditor_overlay:audit.inputs.ltm/ThreatDragonThreatModelPath/g' \\\r\n    | tee auditor_overlay.yaml\r\n```\r\n\r\nGenerate `GOOD_THREATS.md` with auditing overlay.\r\n\r\n```console\r\n$ alice threats -log debug \\\r\n    -overlay auditor_overlay.yaml \\\r\n    -inputs \\\r\n      models/good.json=ThreatDragonThreatModelPath \\\r\n      models/GOOD_THREATS.md=ThreatsMdPath\r\n```\r\n\r\nGenerate `BAD_THREATS.md` with auditing overlay.\r\n\r\n```console\r\n$ alice threats -log debug \\\r\n    -overlay auditor_overlay.yaml \\\r\n    -inputs \\\r\n      models/bad.json=ThreatDragonThreatModelPath \\\r\n      models/BAD_THREATS.md=ThreatsMdPath\r\n```\r\n\r\nDump out to HTTP to copy to GitHub for rendering.\r\n\r\n```console\r\n$ (echo -e 'HTTP/1.0 200 OK\\n' && cat models/GOOD_THREATS.md) | nc -Nlp 9999;\r\n$ (echo -e 'HTTP/1.0 200 OK\\n' && cat models/BAD_THREATS.md) | nc -Nlp 9999;\r\n```
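\r\n\r\nFor reference, a hypothetical shape for `auditor_overlay.py` (a sketch, not the actual file): the parameter name `ltm` is what yields the auto-generated input definition name `auditor_overlay:audit.inputs.ltm` which the `sed` above renames.\r\n\r\n```python\r\nimport json\r\nimport pathlib\r\n\r\nimport dffml\r\n\r\n\r\n@dffml.op\r\nasync def audit(ltm: str) -> dict:\r\n    # Hypothetical audit: load the Threat Dragon model and count its diagrams\r\n    model = json.loads(pathlib.Path(ltm).read_text())\r\n    return {\"diagrams\": len(model.get(\"detail\", {}).get(\"diagrams\", []))}\r\n\r\n\r\nAUDITOR_OVERLAY = dffml.DataFlow.auto(audit)\r\n```",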
"replies": [
{
"body": "## 2022-10-25 @pdxjohnny Engineering Logs\r\n\r\n- https://twitter.com/hardmaru/status/1584731173426954241\r\n - > Backprop is just another \u201chand-engineered\u201d feature\r\n - grep discussion for more details\r\n- Sourced today's team log from https://github.com/intel/dffml/commit/208ac457b378aab86d28775d0f10d0bc25b0a212#diff-986012018712addda9630dba0adf9035e6f8aae84e4410390f99cbc5618c574e\r\n- stream of contsiouness enable gitops for entities (agents, humans, etc.) config for their background listenting notifiaction prefs\r\n - Like a robots.txt for should you notify me, same as we are doing with the plugins\r\n- https://github.com/jurgisp/memory-maze\r\n - https://twitter.com/danijarh/status/1584893538180874241\r\n- Future\r\n - Expand upon [Volume 1: Chapter 1: Down the Dependency Rabbit-Hole Again](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md) to add dynamic analysis, aka tell me what the delta on CI env is. \r\n- Misc people to circle back with\r\n - John Whiteman was planning on writing collectors and analyzing AST\r\n - Michael could help us generate PDFs from Sphinx sites\r\n- https://twitter.com/OR13b/status/1584975480889147392\r\n - Need to dig into this and why entityType got the banhammer"
},
{
"body": "## 2022-10-25 Alice Initiative welcome aboard!\r\n\r\n- Harsh joining us to do some Python package analysis work\r\n- Alice thread: https://github.com/intel/dffml/discussions/1406?sort=new\r\n- This work feeds into the following tutorial\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md\r\n- [shouldi: deptree: Create dependency tree of project \u00b7 Issue #596 \u00b7 intel/dffml](https://github.com/intel/dffml/issues/596)\r\n - https://github.com/intel/dffml/commits/shouldi_dep_tree\r\n - > The idea behind the work that was done so far in the above branch was to produce the full dependency tree for a given python package.\r\n- Documentation writing process\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0004_writing_the_wave.md#vision\r\n- Contributing Documentation\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n- Troubleshooting \u2014 DFFML fd401e426 documentation\r\n - https://intel.github.io/dffml/main/troubleshooting.html#entrypointnotfound\r\n- Next steps\r\n - Harsh will first focus on filling out the other two functions with unit tests for different file contents\r\n - These functions / files can be standalone at first, we can integrate later.\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst#writing-an-overlay\r\n - Harsh to ping John as needed.\r\n - Harsh to comment in issue with commands run and errors and so forth so we can copy pate into the associated tutorial later.\r\n - Plans for automation of documentation writing: https://github.com/intel/dffml/commit/74781303fae19b03326878d184a49ac93543749c?short_path=76e9bfe#diff-76e9bfe1c05d4426559fada22595ca1f9a76fd0fc98609dfbbde353d10fa77db\r\n\r\nhttps://github.com/intel/dffml/blob/0a2e053f5f8e361054f329a3f763982fb1e4d1f7/examples/shouldi/tests/test_dep_tree.py#L36-L71"
}
]
},
{
"body": "# 2022-10-26 Engineering Logs\r\n\r\n- https://en.m.wikipedia.org/wiki/Knowledge_argument \r\n - `alias Alice=Mary`\r\n - grep\r\n - fourth eye \ud83d\udc41\ufe0f \r\n - Scientific process\r\n\r\nTODO Alice gif for black and white to color (the acquisition of the fourth eye, when she steps through the looking glass)\r\n",
"replies": [
{
"body": "## 2022-10-26 @sedihglow Engineering Logs\r\n\r\n- https://github.com/sedihglow/red_black_tree\r\n- https://gist.github.com/sedihglow/770ed4e472935c5ab302d069b64280a8\r\n - How Python's builtin `sorted()` works\r\n - https://docs.python.org/3/library/functions.html#sorted\r\n- References\r\n - http://www.microhowto.info/howto/convert_from_html_to_formatted_plain_text.html\r\n - `$ lynx -dump -display_charset UTF-8 \"https://docs.docker.com/engine/install/ubuntu/\"`\r\n - https://unix.stackexchange.com/questions/336253/how-to-find-gnome-terminal-currently-used-profile-with-cmd-line\r\n - `--save-config` has been removed\r\n- Docker\r\n - https://github.com/pdxjohnny/dockerfiles/blob/406f0b94838f7dcd1792c394061a2ee18c4f7487/sshd/Dockerfile\r\n- https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst#cloning-the-repo\r\n- Vim\r\n - Exit insert mode `Ctrl-[`\r\n\r\n```console\r\n$ git clone -b alice https://github.com/intel/dffml\r\n$ cd dffml/entities/alice\r\n$ python -m pip install \\\r\n -e .[dev] \\\r\n -e ../../ \\\r\n -e ../../examples/shouldi/ \\\r\n -e ../../feature/git/ \\\r\n -e ../../operations/innersource/ \\\r\n -e ../../configloader/yaml/\r\n```"
},
{
"body": "## 2022-10-26 @pdxjohnny Engineering Logs\r\n\r\n- https://github.com/intel/dffml/pull/1420\r\n- https://en.m.wikipedia.org/wiki/Knowledge_graph\r\n- https://github.com/peacekeeper/uni-resolver-driver-did-example\r\n- https://medium.com/transmute-techtalk/neo4j-graph-data-science-with-verifiable-credential-data-98b806f2ad78\r\n- with regards to thought arbitrage\r\n - Decentralised Finance and Automated Market Making: Execution and Speculation\r\n - https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4144743\r\n- TPM\r\n - https://0pointer.de/blog/brave-new-trusted-boot-world.html\r\n- AutoML\r\n - https://github.com/automl/TabPFN\r\n- Updated Alice in CLI help, OS DecentrAlice sshd_banner, Google Drive AliceisHere, and here in this thread below.\r\n\r\n![alice-looking-up-no-shadow](https://user-images.githubusercontent.com/5950433/198141595-f7db1356-5446-49df-a0d7-731010fe1326.png)"
}
]
},
{
"body": "# 2022-10-27 Engineering Logs\r\n\r\n> Source: https://pdxjohnny.github.io/terminal-quickstart/\r\n\r\n[![terminal-quickstart](https://github.com/pdxjohnny/pdxjohnny.github.io/raw/dev/static/images/terminal-quickstart.gif)](https://pdxjohnny.github.io/terminal-quickstart/)\r\n\r\n- So called \"effective altruism movement\" is not aligned\r\n - What you are now is what you are becoming.\r\n - Same goes for the collective.\r\n- Example threat model scenario\r\n - Imagine a software security researcher named Alice.\r\n - Alice want wants to publicize her scientific research so\r\n as to engage in discourse in the community and further\r\n the [state of the art](https://en.wikipedia.org/wiki/State_of_the_art).\r\n - Why she decided furthering the state of the art in field X\r\n is out of scope for this scenario. It would have been\r\n defined by reward mechanisms and the top level system\r\n context's gatekeeper and priroritizer. Alice may in this situation also be a tenant attempting to escape the sandbox of her top level system context\u2019s multi tenant environment, she (sum of parts, inputs within context) herself a context.\r\n - Alice searches for communities to engage with, forums\r\n chats, activity, any signs of life in the conceptual field\r\n (the train of thought).\r\n - Alice's query yields a malicious attacker controlled community.\r\n - Acceleration in this community's train of thought is\r\n measured to be outside of acceptable impact bounds to her values\r\n / ethics / strategic principles and plans. She determines this by\r\n predicting future state.\r\n - How does Alice know that she should avoid working with\r\n unaligned entities? How did she determine it was detrimental\r\n to her strategic principles when viewed from lifecycle scope?\r\n - Traversal of trust graphs!\r\n - [2022-10-27 IETF SCITT Technical Meeting Notes](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3983087)\r\n - https://github.com/intel/dffml/issues/1315\r\n - > Just think about it like \ud83d\udc22 turtling in an RTS game or like being zen. You just don\u2019t engage, you dont care, you\u2019re focused with your alys in your ad hoc formed groups\r\n - open source community cross talk / innersource: example set CNCF projects are aligned trees from similar roots.\r\n - you look at other parts of your lifecycle to see how you can position yourself within the multi dimensional strategic field landscape which your top level strategic principles apply to within a context\r\n - wardly maps we\r\n- TODO\r\n - [ ] analysis of kubernetes community handling of aligned events and community response to unaligned actors",
"replies": [
{
"body": "## 2022-10-27 @pdxjohnny Engineering Logs\r\n\r\n- Version Control Systems\r\n - https://github.com/facebookexperimental/eden\r\n - https://www.youtube.com/watch?v=bx_LGilOuE4&feature=youtu.be\r\n - https://twitter.com/bernhardsson/status/1585652692701036544\r\n- Well I'll be, I forgot I already wrote a terminal quickstart doc until I accidently opened the attach file dialog and saw this gif I'd been meaning to add here.\r\n - There is some stuff in this thread about teaching alice to use the shell.\r\n - consoletest commands to graph nueral network markov chains?\r\n - https://github.com/pdxjohnny/consoletest\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0003_a_shell_for_a_ghost.md\r\n - Maybe we do this as a future tutorial to Architecting Alice: A Shell for A Ghost\r\n- https://threadreaderapp.com/thread/1584623497284026368\r\n - https://indieweb.org/Micropub\r\n - https://wordpress.org/plugins/indieauth/\r\n - https://indieweb.org/Micropub/Servers\r\n- TODO\r\n - [ ] DID resolver / proxy for https://github.com/facebookexperimental/eden"
},
{
"body": "## 2022-10-27 IETF SCITT Technical \r\n\r\n- https://datatracker.ietf.org/wg/scitt/about/\r\n- https://github.com/ietf-scitt/scitt-web/blob/065ae3bf467e236d18774d954b5784d97c43ec17/_posts/2022-10-25-distributing-artifacts.md\r\n- Zulip and Slack exists for IETF\r\n - Comply with appropriate legal guidance\r\n - Have fun creating channels an chatting otherwise\r\n - Do not assume privacy, this is a hosted service.\r\n - https://xkcd.com/1810/\r\n\r\n![XKCD 1810: Chat Systems](https://user-images.githubusercontent.com/5950433/198354823-60c51c09-9644-4d1f-a434-9a474b2f5095.png)\r\n\r\n- Supply chain as a network of information that travels across an ecosystem\r\n - Decentralization is natural in supply chains\r\n- https://datatracker.ietf.org/meeting/upcoming\r\n - See below\r\n- Example flow / bare bones model\r\n - When we need the software artifact it's available, it didn't change\r\n - Need better tooling to keep copies in sync\r\n - SCITT will be one of them\r\n - Archiving\r\n - Deployment logs\r\n - Auditing for mitigation and upgrades\r\n- How do we make sure that we never move the cheese on customers and they can roll forward and continue to take advantages of advancements in the future\r\n- https://github.com/ietf-scitt/use-cases/blob/main/scitt-components.md\r\n - More detailed view\r\n - We can fill this out\r\n- ACME Rockets\r\n - Wabbit Networks from example can make internal information public easily\r\n - They might have one SCITT instance that delivers\r\n - They might have one SCITT instance that delivers provenance information to customers about released artifacts\r\n- Each endpoint example: roy.azurecr.io\r\n - Container Registry with signing aligned (azurecr means Azure Container Registry)\r\n - Network boundries complicate permission models\r\n- We need to iron out / document how to do transparent / clean replication\r\n - petnames spec\r\n- Orie: How much detail is in the graph is trust...\r\n - John (unsaid, for the notes only): Trust is for sure not binary, but within a given context that value for the green in the trust graph might become infinitely close to 1.\r\n- Every entity that runs a SCITT instance will have a choice of who they trust\r\n- We want to try to give you a simple solution that\r\n\r\n---\r\n\r\nDRAFT SCITT Agenda, IETF 115, London, UK\r\nDonnerstag, 10. November 2022\r\n09:30 - 11:30\tThursday Session I\r\n \r\n1. Welcome, Agenda Bashing (Chairs, 5 min)\r\n\r\n2. Architecture (TBD, 20 min)\r\ndraft-birkholz-scitt-architecture-02\r\n\r\n2. Software Supply Chain Uses Cases for SCITT (TBD, 30 min)\r\ndraft-birkholz-scitt-software-use-cases-00\r\n\r\n3. Hackathon Report (TBD, 30 min)\r\n\r\n4. SCITT Receipt Report from COSE (TBD, 20 min)\r\n\r\n5. AOB (Open Mic) & Next Steps (Chairs, 15 min)"
}
]
},
{
"body": "# 2022-10-28 Engineering Logs",
"replies": [
{
"body": "- https://twitter.com/0x_philbert/status/1585805986048233472?s=20&t=EQzvXUz0Kz3T-IwKQm2e2Q\r\n- Sequence for mental model docs\r\n - alice as Ghost in brain\r\n - We pick Her out of our head with two fingers\r\n - we ask her\r\n - whoooo\r\n - Are\r\n - Youuuu?\r\n - she helps us look in now that shes out\r\n - We write it all down\r\n - here is where we define the multi context parallel conscious state mental model and map that to the dataflow description\r\n - This is probably also where the draft example sequence (downloder.py) original improve dataflow docs code should go.\r\n - https://github.com/intel/dffml/issues/1279#issuecomment-1025267749\r\n - We give her stack of software pancakes that say EAT me\r\n - She grows to our size"
}
]
},
{
"body": "# 2022-10-29 Engineering Logs",
"replies": [
{
"body": "- https://twitter.com/kelseyhightower/status/1586005703184945152?s=20&t=k6TbZZWA9-0eSSQRO9o10Q \r\n- https://www.princeton.edu/~wbialek/rome/refs/kelly_56.pdf\r\n - Vol 3\r\n - > If the input symbols to a communication channel represent the outcomes of a chance event on which bets are available at odds consistent with their probabilities (i.e., \u201cfair\u201d odds), a gambler can use the knowledge given him by the received symbols to cause his money to grow exponentially. The maximum exponential rate of growth of the gambler\u2019s capital is equal to the rate of transmission of information over the channel. This result is generalized to include the case of arbitrary odds.\r\n >\r\n > Thus we find a situation in which the transmission rate is significant even though no coding is contemplated. Previously this quantity was given significance only by a theorem of Shannon\u2019s which asserted that, with suitable encoding, binary digits could be transmitted over the channel at this rate with an arbitrarily small probability of error.\r\n\r\ndump some offline notes from months ago:\r\n\r\nG 11:6, 3:22\r\n\r\nWe are beginning to accelerate in time as knowledge travels faster. As learning happens faster and taking action on those learnings due to agent parallelization trains of thought executed overlap as aligned. The more system contexts plus state of consciousness (feature data plus overlayed strategic plans) we have the fast time goes relatively in that thread (much like in the animated Hercules, the threads of time, the more twine in the thread the more thread passes through the eye of a needle. The higher the throughput in that thread of time. Since we think in parallel and conceptually but we are only visualizing system contexts plus state of human understood state of consciousness combined as a thread right now, the thread of time the witch holds. That thread represents one persons life. If you look at a persons life as a string which is ever growing so long as they are alive. Say the number of pieces of twine in that string were equal parts divisible by every state of human consciousness we understand they were ever in, so if we did a subset of every state of consciousness we understand as humans, this subset being if they were in deep sleep for 1/4 of their lives, in restless sleep for 1/4, in high alert state for 1/4, and in regular alertness for 1/4. Then we\u2019d see four twines making up the string. If you visualize those as actions, good deeds, bad deeds, then you can classify everything into pieces of twine for either good or bad path and you can see how fast a set of system contexts is progressing in the right ir wring direction. The goal is to progress in the right direction as fast as possible"
}
]
},
{
"body": "# 2022-10-30 Engineering Logs",
"replies": []
},
{
"body": "# 2022-10-31 Engineering Logs",
"replies": [
{
"body": "- https://trendoceans.com/atuin-linux/\r\n- https://docs.google.com/document/d/1xfU_s1Eu51z_WGg5VYBsQtjsKcrV6_TvFXj2WxBcj90/edit\r\n- https://socialhub.activitypub.rocks/pub/guide-for-new-activitypub-implementers\r\n- https://docs.microblog.pub/\r\n- https://raw.githubusercontent.com/rjb4standards/REA-Products/master/jsonvrf.json\r\n- https://github.com/OR13/endor\r\n- https://github.com/w3c/vc-data-model\r\n- https://github.com/bluesky-social/atproto"
}
]
},
{
"body": "# 2022-11-01 Engineering Logs",
"replies": [
{
"body": "## 2022-11-01 @pdxjohnny Engineering Logs\r\n\r\n- https://github.com/w3c/cogai/pull/47\r\n - A [call for contribution](https://www.youtube.com/watch?v=THKMfJpPt8I&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw) from the [DFFML Community](https://github.com/intel/dffml/discussions/1406?sort=new) to collaboratively [plan](https://www.youtube.com/watch?v=UIT5Bl3sepk&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw) and thereby [manifest](https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md) description of any system architecture or process flow via the [Open Architecture](https://github.com/intel/dffml/blob/alice/docs/arch/0009-Open-Architecture.rst) methodology, as well as a reference entity, [Alice](https://github.com/intel/dffml/tree/alice/entities/alice/). Their work has a [supply chain security (train of thought security) focus](https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice).\r\n- https://en.m.wikipedia.org/wiki/L-system\r\n - DNA permutations\r\n - dependcy trees\r\n - Operation valid input prameter setd from cache / seed state\r\n - propositional logic \ud83d\udd4a\ufe0f\r\n- https://github.com/w3c/cogai#cognitive-architecture\r\n - https://github.com/w3c/cogai/blob/master/Contributing.md\r\n - **ALIGNED**\r\n - https://en.wikipedia.org/wiki/ACT-R\r\n - http://act-r.psy.cmu.edu/peoplepages/ja/ja-interests.html\r\n - **ALIGNED** (huh-Huh!)\r\n - http://act-r.psy.cmu.edu/software/\r\n - We can take a look at this for reuse within our InnerSource series\r\n - https://github.com/w3c/cogai/blob/master/minimalist.md\r\n - Very similar to our recent research on graphql-ld\r\n - https://github.com/w3c/cogai/blob/master/faq.md#how-do-chunks-relate-to-rdf-and-property-graphs\r\n- https://github.com/ossf/scorecard#installation\r\n- https://github.com/guacsec/guac/blob/main/SETUP.md\r\n- https://github.com/rqlite/rqlite/blob/master/DOC/RESTORE_FROM_SQLITE.md\r\n- https://github.com/marionebl/svg-term-cli\r\n- Embrace Chaos\r\n - Know Chaos\r\n - Roll with Chaos\r\n\r\n[![EDAC21EB-8311-4E0F-BA9A-D53013109C67](https://user-images.githubusercontent.com/5950433/199291178-7e89705d-f662-44cd-aa3e-e1a24eb61256.jpeg)](https://en.wikipedia.org/wiki/Sophia_(Gnosticism))\r\n\r\n- TODO\r\n - [ ] Circle back with Melvin"
}
]
},
{
"body": "# 2022-11-02 Engineering Logs",
"replies": [
{
"body": "## 2022-11-02 @pdxjohnny Engineering Logs\r\n\r\n- Vol 3: Train of Thought Graffiti\r\n - Making data show up on/in other data traveling over target controlled infra\r\n- https://scitt.io/distributing-with-oci-registries.html\r\n - https://datatracker.ietf.org/wg/scitt/about/\r\n - https://oras.land/\r\n - https://mailarchive.ietf.org/arch/msg/scitt/bOPu8GoZyGWusOOHSFsQq47Xj4Y/\r\n - See below todos on service endpoint\r\n- https://github.com/w3c/cogai/pull/47\r\n- https://www.w3.org/People/Raggett/\r\n - > My current focus is on how to build **AI systems that mimic human reasoning** inspired by decades of advances in the cognitive sciences, and hundreds of millions of years of evolution of the brain. This is a major paradigm shift compared to the Semantic Web which is steeped in the Aristotelian tradition of mathematical logic and formal semantics. This will enable the **Sentient Web** as the combination of sensing, actuation and cognition federated across the Web in support of markets of services based upon open standards.\r\n - **ALIGNED**\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice\r\n - > The [W3C Cognitive AI Community Group](https://www.w3.org/community/cogai/) is seeking to incubate ideas that combine symbolic information (graphs) with sub-symbolic information (statistics), rules and high performance graph algorithms. This combination enables machine learning and reasoning in the presence of uncertainty, incompleteness and inconsistencies. The starting point has been the development of the [chunks and rules format](https://github.com/w3c/cogai/blob/master/chunks-and-rules.md) as an amalgam of RDF and Property Graphs. A [series of demos](https://github.com/w3c/cogai/blob/master/demos/README.md) are being developed to explore different aspects, using an open source JavaScript library.\r\n - **ALIGNED**\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/0009-Open-Architecture.rst\r\n- https://www.w3.org/2002/mmi/\r\n- https://www.w3.org/WAI/APA/\r\n- https://web.archive.org/web/20200926173320/http://webinos.org/2011/06/09/webinos-whitepaper/\r\n - > webinos is: a collective project to make the web work for applications. webinos has a vision to build a multi-device, applications platform based on web technology that: \u2013 allows web apps to run seamlessly across multiple devices and to use resources across devices \u2013 allows web applications to communicate with other web applications and (non web components) over multiple device \u2013 links the application experience with the social network \u2013 achieves all of the above in a security preserving manner \u2013 explicitly targets the four distinct \u201cscreens\u201d: the mobile, the PC, the in-car (automotive) and the home media (TV) devices. The intent in webinos is to translate the success of the web as a distributed document publishing system into a successful, distributed applications platform. The webinos platform should be built upon and move forward the required open standards. This platform should have a concrete implementation that is accessible to all as an open source asset. Technically, all of this should be achieved reusing the core development technologies that have already proven themselves on the Web (HTML and JavaScript), affording the benefits of speed of development and access to a large developer talent pool. 
The innovation webinos brings shall not just be technical; by embracing an open web culture, we hope to create an application framework that does not favour any particular corporation, and on which many parties can collaborate, and from which many companies benefit.\r\n  - https://github.com/intel/dffml/blob/3530ee0d20d1062605f82d1f5055f455f8c2c68f/docs/about.rst#philosophy\r\n- https://en.wikipedia.org/wiki/Cognitive_tutor\r\n- https://en.wikipedia.org/wiki/Intelligent_tutoring_system\r\n- TODO\r\n  - [ ] Vol 4: Programming as checkers, line up the research so that you can get farther in one turn\r\n  - [ ] Time bounded search for research and time to hop (implementation)\r\n  - [ ] Demo metric scan with SCITT receipt used to auth upload results to HTTP server (stream of consciousness / webhook server). Root trust in OIDC token similar to fulcio/sigstore github actions slsa demo.\r\n    - Future\r\n      - [ ] Demo demo to OpenSSF Metrics WG for collaboration on DB\r\n      - [ ] Do this for each `Input`\r\n      - [ ] Instead of HTTP server the context addressable registry\r\n      - [ ] Link via DWNs\r\n      - [ ] Hardware rooted keys\r\n      - [ ] Knit above together with `I/L/R/OP/OPIMPNetwork`s for distributed compute\r\n      - [ ] Trust anchors of other than self support\r\n      - [ ] Caching\r\n\r\n\r\n---\r\n\r\n- We hope that this work will aid in a heightening of train of thought security posture.\r\n- Our objective is to increase aggregate train of thought security posture.\r\n- Our objective is to increase the aggregate train of thought security\r\n- Supply chain security posture\r\n- The aggregate security of the software supply chain\r\n- The security of the aggregate software supply chain\r\n- The security of the software supply chain in the aggregate\r\n- Heightening of the security of the collective train of thought.\r\n- Heightening of state of art in train of thought security posture.\r\n- We want to secure our thought processes"
},
{
"body": "## 2022-11-02 Harsh/John\r\n\r\n- https://github.com/intel/dffml/issues/596#issuecomment-1301191994\r\n- Installed VS Code build tools and used the developer prompt from there and it worked\r\n- Remembered pipdeptree exists\r\n- We should use https://github.com/tox-dev/pipdeptree and integrate that into shouldi.\r\n\r\n```\r\n -j, --json Display dependency tree as json. This will yield \"raw\"\r\n output that may be used by external tools. This option\r\n overrides all other options.\r\n```\r\n\r\n- https://intel.github.io/dffml/main/examples/shouldi.html\r\n- https://intel.github.io/dffml/main/contributing/dev_env.html\r\n\r\n```console\r\n$ git clone https://github.com/intel/dffml\r\n$ cd dffml\r\n$ python -m venv .venv\r\n$ git checkout -b deptree\r\n$ . .venv/Scripts/activate\r\n$ pip install -e .[dev]\r\n$ cd examples/shouldi\r\n$ pip install -e .[dev]\r\n```\r\n\r\n- https://intel.github.io/dffml/main/api/util/packaging.html#dffml.util.packaging.mkvenv\r\n- https://github.com/tox-dev/pipdeptree#running-in-virtualenvs\r\n\r\nhttps://github.com/intel/dffml/blob/b892cfab9bd152c47a709e8708491c95b8c3ec8e/tests/docs/test_consoletest.py#L14\r\n\r\n- Basic testcase will be to analyze shouldi itself\r\n\r\nhttps://github.com/intel/dffml/blob/3530ee0d20d1062605f82d1f5055f455f8c2c68f/dffml/util/testing/consoletest/commands.py#L83-L190\r\n\r\n- Opens\r\n - Pip not installing to virtualenv we created (using different Python despite our current efforts)\r\n- TODO\r\n - [ ] Harsh to investigate refactoring `ActivateVirtualEnvCommand` into something that doesn't mess with `os.environ` and behaves more like `mkvenv()` (https://github.com/intel/dffml/tree/main/dffml/util/testing/consoletest/)\r\n - [ ] Explicitly use path returned from venv creation as zeroith argument to `dffml.run_command()/subprocess.check_call()`"
}
]
},
{
"body": "# 2022-11-03 Engineering Logs",
"replies": [
{
"body": "## 2022-11-03 @pdxjohnny Engineering Logs\r\n\r\n- https://identity.foundation/presentation-exchange/spec/v2.0.0/\r\n- https://github.com/geyang/plan2vec\r\n- http://tkipf.github.io/\r\n - https://github.com/tkipf/gae\r\n - Graph Auto Encoders\r\n - https://github.com/tkipf/c-swm\r\n - > Contrastive Learning of Structured World Models\r\n > Abstract: A structured understanding of our world in terms of objects, relations, and hierarchies is an important component of human cognition. Learning such a structured world model from raw sensory data remains a challenge. As a step towards this goal, we introduce Contrastively-trained Structured World Models (C-SWMs). C-SWMs utilize a contrastive approach for representation learning in environments with compositional structure. We structure each state embedding as a set of object representations and their relations, modeled by a graph neural network. This allows objects to be discovered from raw pixel observations without direct supervision as part of the learning process. We evaluate C-SWMs on compositional environments involving multiple interacting objects that can be manipulated independently by an agent, simple Atari games, and a multi-object physics simulation. Our experiments demonstrate that C-SWMs can overcome limitations of models based on pixel reconstruction and outperform typical representatives of this model class in highly structured environments, while learning interpretable object-based representations.\r\n- https://filebase.com/blog/5-ipfs-use-cases-you-havent-thought-of-yet/ (or maybe they're exactly what we've thought of ;)\r\n - > 1. Distributed Package Management\r\n > Package managers, like NPM, are typically stored and managed in a centralized manner. By hosting software packages on IPFS, they can be stored in a distributed manner that is publicly available. Any changes to the package\u2019s versions, like a bug fix, will be reflected by a new CID value, allowing for verification of updates and tracking package development.\r\n >\r\n > 2. Hosting Software Containers\r\n > Software containers, like Docker containers, are available through registries like the Docker registry. This is similar to pulling a package from NPM, but for software containers rather than packages. By using IPFS to host your own registry, there isn\u2019t any domain hosting configuration, DNS management, or user permission management. Simply use the IPFS CID with an IPFS HTTP gateway inside a curl command rather than use a docker pull command to download the container\u2019s image.\r\n >\r\n > 3. Decentralized eCommerce websites\r\n > Through packages like DeCommerce, spinning up your own eCommerce website is as simple as uploading the DeCommerce folder to your Filebase bucket, then navigating to the IPFS HTTP gateway URL of your folder\u2019s CID. Since you\u2019re equipped with all the necessary webpages and configurations, you can spend time customizing the CSS files to style your website and upload your products, rather than spending time managing a domain, SSL certificates, or figuring out how to accept crypto payments (which DeCommerce comes equipped with by default!).\r\n >\r\n > 4. Decentralized Operating Systems\r\n > Along with decentralized software packages and containers, decentralized operating systems are another form of software that can benefit from being hosted on IPFS. 
A handful of decentralized, blockchain-based operating systems have emerged, but storing the data for these operating systems on their native blockchain is typically against best practices since it can be expensive and have high latency. For this reason, many layer-1 blockchains will either store data externally, like on IPFS, or they\u2019ll use a layer-2 chain to handle data storage. Therefore, decentralized operating systems that run on a blockchain can highly benefit from being hosted on IPFS while they communicate externally with the blockchain network.\r\n >\r\n > 5. Decentralized Peer Reviews of Academic Research Papers\r\n > In addition to JPEG art being minted as NFT collections, pieces of writing such as blog posts, eBooks, and whitepapers have begun to gain traction as NFTs as well. Written content benefits from being minted on a blockchain since it verifies who the original writer of the content is, allowing for easier clarification when it comes to copyright, plagiarism, or other duplication of writing. Any text document or Microsoft Word document can be hosted on IPFS and then referenced inside of a smart contract that is deployed on Ethereum or Polygon, creating a permanent record of that piece of writing being created by the author.\r\n > For academic papers, this is a real game changer. Users can mint their research papers as an NFT that uses PDF or text documents hosted on IPFS, and then gain a verifiable reputation for their research and any peer reviews they contribute to other researchers. In addition to the smart contract\u2019s verifiable address, the IPFS CID can be used as an additional form of verification that the content was created by the original author and hasn\u2019t been altered since publication.\r\n- Carbon aware SDK\r\n - https://github.com/Green-Software-Foundation/carbon-aware-sdk\r\n- Metrics for carbon measurement\r\n - Software Carbon Intensity (SCI) - taking action\r\n - Greenhouse Gas Protocol (GHG) - reporting\r\n- Carbon measurement telemetry\r\n - https://github.com/sustainable-computing-io/kepler\r\n - > Kepler (Kubernetes-based Efficient Power Level Exporter) uses eBPF to probe energy related system stats and exports as Prometheus metrics\r\n - https://github.com/hubblo-org/scaphandre\r\n - > Energy consumption metrology agent. 
Let \"scaph\" dive and bring back the metrics that will help you make your systems and applications more sustainable !\r\n\r\n```console\r\n$ pip install -e entities/alice\r\n$ dffml service dev entrypoints list dffml.overlays.alice.please.log.todos\r\nOverlayCLI = alice.please.log.todos.todos:OverlayCLI -> alice 0.0.1 (/home/pdxjohnny/.local/lib/python3.9/site-packages)\r\nOverlayRecommendedCommunityStandards = alice.please.log.todos.todos:AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues -> alice 0.0.1 (/home/pdxjohnny/.local/lib/python3.9/site-packages)\r\n$ dffml service dev export -configloader json alice.cli:AlicePleaseLogTodosCLIDataFlow | tee logtodos.json\r\n$ (echo '```mermaid' && dffml dataflow diagram logtodos.json && echo '```') | gh gist create -f \"LOG_TODOS_DATAFLOW_DIAGRAM.md\" -\r\n```\r\n\r\n- Oneliner: `dffml service dev export -configloader json alice.cli:AlicePleaseLogTodosCLIDataFlow | tee logtodos.json && (echo '```mermaid' && dffml dataflow diagram logtodos.json && echo '```') | gh gist create -f \"LOG_TODOS_DATAFLOW_DIAGRAM.md\" -`\r\n\r\n\r\n```mermaid\r\ngraph TD\r\nsubgraph a759a07029077edc5c37fea0326fa281[Processing Stage]\r\nstyle a759a07029077edc5c37fea0326fa281 fill:#afd388b5,stroke:#a4ca7a\r\nsubgraph d9f2c7ced7f00879629c15363c8e307d[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guess_repo_string_is_url]\r\nstyle d9f2c7ced7f00879629c15363c8e307d fill:#fff4de,stroke:#cece71\r\n37178be7db9283b44a1786fef58ffa8d[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guess_repo_string_is_url]\r\n5c7743e872c165030dcf051c712106fc(repo_string)\r\n5c7743e872c165030dcf051c712106fc --> 37178be7db9283b44a1786fef58ffa8d\r\n8d32e3f614b2c8f9d23e7469eaa1da12(result)\r\n37178be7db9283b44a1786fef58ffa8d --> 8d32e3f614b2c8f9d23e7469eaa1da12\r\nend\r\nsubgraph ed8e05e445eabbcfc1a201e580b1371e[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guessed_repo_string_is_operations_git_url]\r\nstyle ed8e05e445eabbcfc1a201e580b1371e fill:#fff4de,stroke:#cece71\r\nf129d360149fb01bbfe1ed8c2f9bbaa2[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guessed_repo_string_is_operations_git_url]\r\n77a8695545cb64a7becb9f50343594c3(repo_url)\r\n77a8695545cb64a7becb9f50343594c3 --> f129d360149fb01bbfe1ed8c2f9bbaa2\r\nd259a05785074877b9509ed686e03b3a(result)\r\nf129d360149fb01bbfe1ed8c2f9bbaa2 --> d259a05785074877b9509ed686e03b3a\r\nend\r\nsubgraph 0fb0b360e14eb7776112a5eaff5252de[alice.please.log.todos.todos.OverlayCLI:cli_has_repos]\r\nstyle 0fb0b360e14eb7776112a5eaff5252de fill:#fff4de,stroke:#cece71\r\n81202a774dfaa2c4d640d25b4d6c0e55[alice.please.log.todos.todos.OverlayCLI:cli_has_repos]\r\n7ba42765e6fba6206fd3d0d7906f6bf3(cmd)\r\n7ba42765e6fba6206fd3d0d7906f6bf3 --> 81202a774dfaa2c4d640d25b4d6c0e55\r\n904eb6737636f1d32a6d890f449e9081(result)\r\n81202a774dfaa2c4d640d25b4d6c0e55 --> 904eb6737636f1d32a6d890f449e9081\r\nend\r\nsubgraph 964c0fbc5f3a43fce3f0d9f0aed08981[alice.please.log.todos.todos.OverlayCLI:cli_is_meant_on_this_repo]\r\nstyle 964c0fbc5f3a43fce3f0d9f0aed08981 fill:#fff4de,stroke:#cece71\r\nb96195c439c96fa7bb4a2d616bbe47c5[alice.please.log.todos.todos.OverlayCLI:cli_is_meant_on_this_repo]\r\n2a071a453a1e677a127cee9775d0fd9f(cmd)\r\n2a071a453a1e677a127cee9775d0fd9f --> b96195c439c96fa7bb4a2d616bbe47c5\r\nf6bfde5eece6eb52bb4b4a3dbc945d9f(result)\r\nb96195c439c96fa7bb4a2d616bbe47c5 --> f6bfde5eece6eb52bb4b4a3dbc945d9f\r\nend\r\nsubgraph 2e2e8520e9f9420ffa9e54ea29965019[alice.please.log.todos.todos.OverlayCLI:cli_run_on_repo]\r\nstyle 
2e2e8520e9f9420ffa9e54ea29965019 fill:#fff4de,stroke:#cece71\r\nf60739d83ceeff1b44a23a6c1be4e92c[alice.please.log.todos.todos.OverlayCLI:cli_run_on_repo]\r\n0ac5645342c7e58f9c227a469d90242e(repo)\r\n0ac5645342c7e58f9c227a469d90242e --> f60739d83ceeff1b44a23a6c1be4e92c\r\n6e82a330ad9fcc12d0ad027136fc3732(result)\r\nf60739d83ceeff1b44a23a6c1be4e92c --> 6e82a330ad9fcc12d0ad027136fc3732\r\nend\r\nsubgraph b8e0594907ccea754b3030ffc4bdc3fc[alice.please.log.todos.todos:gh_issue_create_support]\r\nstyle b8e0594907ccea754b3030ffc4bdc3fc fill:#fff4de,stroke:#cece71\r\n6aeac86facce63760e4a81b604cfab0b[alice.please.log.todos.todos:gh_issue_create_support]\r\ndace6da55abe2ab1c5c9a0ced2f6833d(file_present)\r\ndace6da55abe2ab1c5c9a0ced2f6833d --> 6aeac86facce63760e4a81b604cfab0b\r\nd2a58f644d7427227cefd56492dfcef9(repo)\r\nd2a58f644d7427227cefd56492dfcef9 --> 6aeac86facce63760e4a81b604cfab0b\r\n7f2eb20bcd650dc00cde5ca0355b578f(issue_url)\r\n6aeac86facce63760e4a81b604cfab0b --> 7f2eb20bcd650dc00cde5ca0355b578f\r\nend\r\nsubgraph cd002409ac60a3eea12f2139f2743c52[alice.please.log.todos.todos:git_repo_to_git_repository_checked_out]\r\nstyle cd002409ac60a3eea12f2139f2743c52 fill:#fff4de,stroke:#cece71\r\ne58ba0b1a7efba87321e9493d340767b[alice.please.log.todos.todos:git_repo_to_git_repository_checked_out]\r\n00a9f6e30ea749940657f87ef0a1f7c8(repo)\r\n00a9f6e30ea749940657f87ef0a1f7c8 --> e58ba0b1a7efba87321e9493d340767b\r\nbb1abf628d6e8985c49381642959143b(repo)\r\ne58ba0b1a7efba87321e9493d340767b --> bb1abf628d6e8985c49381642959143b\r\nend\r\nsubgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL]\r\nstyle d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71\r\nf577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL]\r\n7440e73a8e8f864097f42162b74f2762(URL)\r\n7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40\r\n8e39b501b41c5d0e4596318f80a03210(valid)\r\nf577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210\r\nend\r\nsubgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo]\r\nstyle af8da22d1318d911f29b95e687f87c5d fill:#fff4de,stroke:#cece71\r\n155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo]\r\need77b9eea541e0c378c67395351099c(URL)\r\need77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n8b5928cd265dd2c44d67d076f60c8b05(ssh_key)\r\n8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n4e1d5ea96e050e46ebf95ebc0713d54c(repo)\r\n155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c\r\n6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL}\r\n6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\nend\r\nsubgraph 98179e1c9444a758d9565431f371b232[dffml_operations_innersource.operations:code_of_conduct_present]\r\nstyle 98179e1c9444a758d9565431f371b232 fill:#fff4de,stroke:#cece71\r\nfb772128fdc785ce816c73128e0afd4d[dffml_operations_innersource.operations:code_of_conduct_present]\r\nf333b126c62bdbf832dddf105278d218(repo)\r\nf333b126c62bdbf832dddf105278d218 --> fb772128fdc785ce816c73128e0afd4d\r\n1233aac886e50641252dcad2124003c9(result)\r\nfb772128fdc785ce816c73128e0afd4d --> 1233aac886e50641252dcad2124003c9\r\nend\r\nsubgraph d03657cbeff4a7501071526c5227d605[dffml_operations_innersource.operations:contributing_present]\r\nstyle d03657cbeff4a7501071526c5227d605 fill:#fff4de,stroke:#cece71\r\n8da2c8a3eddf27e38838c8b6a2cd4ad1[dffml_operations_innersource.operations:contributing_present]\r\n2a1ae8bcc9add3c42e071d0557e98b1c(repo)\r\n2a1ae8bcc9add3c42e071d0557e98b1c --> 
8da2c8a3eddf27e38838c8b6a2cd4ad1\r\n52544c54f59ff4838d42ba3472b02589(result)\r\n8da2c8a3eddf27e38838c8b6a2cd4ad1 --> 52544c54f59ff4838d42ba3472b02589\r\nend\r\nsubgraph da39b149b9fed20f273450b47a0b65f4[dffml_operations_innersource.operations:security_present]\r\nstyle da39b149b9fed20f273450b47a0b65f4 fill:#fff4de,stroke:#cece71\r\nc8921544f4665e73080cb487aef7de94[dffml_operations_innersource.operations:security_present]\r\ne682bbcfad20caaab15e4220c81e9239(repo)\r\ne682bbcfad20caaab15e4220c81e9239 --> c8921544f4665e73080cb487aef7de94\r\n5d69c4e5b3601abbd692ade806dcdf5f(result)\r\nc8921544f4665e73080cb487aef7de94 --> 5d69c4e5b3601abbd692ade806dcdf5f\r\nend\r\nsubgraph 062b8882104862540d584516edc60008[dffml_operations_innersource.operations:support_present]\r\nstyle 062b8882104862540d584516edc60008 fill:#fff4de,stroke:#cece71\r\n5cc75c20aee40e815abf96726508b66d[dffml_operations_innersource.operations:support_present]\r\nf0e4cd91ca4f6b278478180a188a2f5f(repo)\r\nf0e4cd91ca4f6b278478180a188a2f5f --> 5cc75c20aee40e815abf96726508b66d\r\n46bd597a57e034f669df18ac9ae0a153(result)\r\n5cc75c20aee40e815abf96726508b66d --> 46bd597a57e034f669df18ac9ae0a153\r\nend\r\nend\r\nsubgraph a4827add25f5c7d5895c5728b74e2beb[Cleanup Stage]\r\nstyle a4827add25f5c7d5895c5728b74e2beb fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph 58ca4d24d2767176f196436c2890b926[Output Stage]\r\nstyle 58ca4d24d2767176f196436c2890b926 fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph inputs[Inputs]\r\nstyle inputs fill:#f6dbf9,stroke:#a178ca\r\n6e82a330ad9fcc12d0ad027136fc3732 --> 5c7743e872c165030dcf051c712106fc\r\n8d32e3f614b2c8f9d23e7469eaa1da12 --> 77a8695545cb64a7becb9f50343594c3\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> 7ba42765e6fba6206fd3d0d7906f6bf3\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> 2a071a453a1e677a127cee9775d0fd9f\r\n904eb6737636f1d32a6d890f449e9081 --> 0ac5645342c7e58f9c227a469d90242e\r\nf6bfde5eece6eb52bb4b4a3dbc945d9f --> 0ac5645342c7e58f9c227a469d90242e\r\n46bd597a57e034f669df18ac9ae0a153 --> dace6da55abe2ab1c5c9a0ced2f6833d\r\nbb1abf628d6e8985c49381642959143b --> d2a58f644d7427227cefd56492dfcef9\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 00a9f6e30ea749940657f87ef0a1f7c8\r\nd259a05785074877b9509ed686e03b3a --> 7440e73a8e8f864097f42162b74f2762\r\nd259a05785074877b9509ed686e03b3a --> eed77b9eea541e0c378c67395351099c\r\na6ed501edbf561fda49a0a0a3ca310f0(seed<br>git_repo_ssh_key)\r\na6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05\r\n8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce\r\nbb1abf628d6e8985c49381642959143b --> f333b126c62bdbf832dddf105278d218\r\nbb1abf628d6e8985c49381642959143b --> 2a1ae8bcc9add3c42e071d0557e98b1c\r\nbb1abf628d6e8985c49381642959143b --> e682bbcfad20caaab15e4220c81e9239\r\nbb1abf628d6e8985c49381642959143b --> f0e4cd91ca4f6b278478180a188a2f5f\r\nend\r\n```\r\n\r\n```console\r\n$ alice please log todos -log debug -repos https://github.com/pdxjohnny/testaaa\r\n```\r\n\r\n- Got `alice please log todos` (slimmed down version of `alice please contribute`) working https://github.com/intel/dffml/commit/adf32b4e80ad916de7749fc0b6e99485fb4107b7\r\n - This will allow us to not deal with the pull request code unless triggered.\r\n - Without the overlay infra complete it's harder to remove ops / modify flows than it is to add to them (static overlay application is what we have and is easy, it's just auto flow the definitions together)\r\n- 
TODO\r\n - [ ] Added `alice please log todos` command adf32b4e80ad916de7749fc0b6e99485fb4107b7\r\n - [ ] Find tutorial location for this, maybe just with data flows stuff\r\n- Future\r\n - [ ] Alice refactor and optimize for reduced carbon emissions\r\n - [ ] Integrate into PR feedback loop"
}
]
},
{
"body": "# 2022-11-04 Engineering Logs",
"replies": [
{
"body": "## 2022-11-04 @pdxjohnny Engineering Logs\r\n\r\n- Issue Ops as a way for people to request Alice pull requests, contributions, interaction, etc.\r\n - https://github.com/valet-customers/issue-ops/blob/6a5e64188ae79dfd11613f5f9bdc75f7b769812b/.github/workflows/issue_ops.yml\r\n - https://github.com/valet-customers/issue-ops/blob/6a5e64188ae79dfd11613f5f9bdc75f7b769812b/.github/ISSUE_TEMPLATE/gitlab_ci.md\r\n- How do we communicate and document when there is new data available or we plan to make new data available.\r\n- How do we uqyer and correlate across sources?\r\n- VEX (JSON-LD?)\r\n - Statuses\r\n - Investigating\r\n - Vulnerable\r\n - Used but not vulnerable\r\n - This version is vuln (to vuln or dep vuln) but we have another one that's not effected\r\n - We will need to establish chains of trust on top of VDR / VEX issuance\r\n - https://cyclonedx.org/capabilities/vdr/#bom-with-embedded-vdr\r\n - https://www.nist.gov/itl/executive-order-14028-improving-nations-cybersecurity/software-security-supply-chains-software-1\r\n - https://cyclonedx.org/capabilities/vex/\r\n - https://energycentral.com/c/pip/what-nist-sbom-vulnerability-disclosure-report-vdr\r\n - https://github.com/CycloneDX/bom-examples/blob/master/SaaSBOM/apigateway-microservices-datastores/bom.json\r\n- InnerSource\r\n - https://innersourcecommons.org/learn/patterns/\r\n - https://github.com/InnerSourceCommons/InnerSourcePatterns\r\n - https://www.youtube.com/watch?v=RjBpZKsAQN0\r\n - A RedMonk Conversation: IBM's Inner Source transformation, scaling a DevOps culture change.\r\n- GitHub Actions\r\n - https://docs.github.com/en/actions/using-jobs/using-concurrency#example-only-cancel-in-progress-jobs-or-runs-for-the-current-workflow\r\n - https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#concurrency\r\n- https://code-as-policies.github.io/\r\n - Need to look into this more\r\n - https://colab.research.google.com/drive/1V9GU70GQN-Km4qsxYqvR-c0Sgzod19-j\r\n - https://ai.googleblog.com/2022/11/robots-that-write-their-own-code.html\r\n - https://web1.eng.famu.fsu.edu/~mpf/research.htm\r\n - > Central to this approach is hierarchical code generation, which prompts language models to recursively define new functions, accumulate their own libraries over time, and self-architect a dynamic codebase.\r\n - Yup\r\n- https://twitter.com/MikePFrank/status/1588539750423547905\r\n - Reversible Computing\r\n - Essentially what we get when we cache our flows plus all our equilibrium reaching time travel stuff (synchronization of system contexts across disparate roots, aka cherry picking patches and A/B validation of results until we reach desired state)\r\n - https://en.wikipedia.org/wiki/Reversible_computing\r\n- http://hiis.isti.cnr.it/serenoa/project-fact-sheet.html\r\n - Some similar principles to ours\r\n - > - New concepts, languages, (intelligent) runtimes and tools are needed to support multi-dimensional context-aware adaptation of SFEs. h ese artefacts will enable SFE engineers to concentrate on the functionality rather than on the implementation details concerning the adaptation to the multiple dimensions of the context of use.\r\n > - Keeping Humans in the Loop. h is principle is twofold. On the one hand, end users should be able to provide feedback or even guide the adaptation process according to their preferences or previous experiences with the system. 
On the other hand, authors, developers and engineers should be able to guide the adaptation process according to their experience and domain knowledge.\r\n    > - Open Adaptiveness. A system is open adaptive \u201cif new adaptation plans can be introduced during runtime\u201d. - Adaptation in ubiquitous computing environments (such as in ambient spaces) is also necessary in order to deal with multiple devices, interaction resources and modalities.\r\n    > - Covering the full adaptation lifecycle to support a full adaptation life-cycle that will result into feedback loops (coming from end users) in order to inform any future adaptation\r\n\r\n```python\r\n# Snippet from the alice.please.log.todos work in progress; imports\r\n# (logging, typing.Dict, NewType, dffml, dffml_feature_git,\r\n# dffml_operations_innersource) and the Support* definitions are\r\n# assumed from the surrounding module\r\nasync def gh_issue_create_if_file_not_present(\r\n    repo_url: str,\r\n    file_present: bool,\r\n    title: str,\r\n    body: str,\r\n    logger: logging.Logger,\r\n) -> Dict[str, str]:\r\n    if file_present:\r\n        return\r\n    return {\r\n        \"issue_url\": await gh_issue_create(\r\n            repo_url,\r\n            title,\r\n            body,\r\n            logger=logger,\r\n        )\r\n    }\r\n\r\n\r\n\"\"\"\r\ndef make_gh_issue_create_opimp_for_file(\r\n    filename: str,\r\n    file_present_definition,\r\n    default_title: str,\r\n    body: str,\r\n):\r\n    IssueTitle = NewType(filename + \"IssueTitle\", str)\r\n    IssueBody = NewType(filename + \"IssueBody\", str)\r\n    IssueURL = NewType(filename + \"IssueURL\", str)\r\n\r\n    # TODO,\r\n    # NOTE dffml.op requires name set in overlay classes for now\r\n\r\n    return new_types, opimp\r\n\"\"\"\r\n\r\n\r\n# : dffml_operations_innersource.operations.FileReadmePresent\r\nclass AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues:\r\n    @dffml.op(\r\n        inputs={\r\n            \"repo\": dffml_feature_git.feature.definitions.git_repository_checked_out,\r\n            \"file_present\": dffml_operations_innersource.operations.FileSupportPresent,\r\n            \"title\": SupportIssueTitle,\r\n            \"body\": SupportIssueBody,\r\n        },\r\n        outputs={\r\n            \"issue_url\": NewType(\"SupportIssueURL\", str),\r\n        },\r\n    )\r\n    async def gh_issue_create_support(\r\n        self,  # self receives the operation implementation context, hence self.logger below\r\n        repo: dffml_feature_git.feature.definitions.git_repository_checked_out.spec,\r\n        file_present: bool,\r\n        title: str,\r\n        body: str,\r\n    ) -> Dict[str, str]:\r\n        return await gh_issue_create_if_file_not_present(\r\n            repo.URL,\r\n            file_present,\r\n            title,\r\n            body,\r\n            logger=self.logger,\r\n        )\r\n\r\n\r\n\"\"\"\r\ncls = AlicePleaseLogTodosDataFlowRecommendedCommnuityStandardsGitHubIssues\r\nfor new_types, opimp in itertools.starmap(\r\n    make_gh_issue_create_opimp_for_file,\r\n    [\r\n        (\"Support\", dffml_operations_innersource.operations.FileSupportPresent),\r\n        (\"Contributing\", dffml_operations_innersource.operations.FileContributingPresent),\r\n        (\"CodeOfConduct\", dffml_operations_innersource.operations.FileCodeOfConductPresent),\r\n        (\"Security\", dffml_operations_innersource.operations.FileSecurityPresent),\r\n    ],\r\n):\r\n    setattr(cls, opimp.op.name, )\r\n    for new_type in new_types:\r\n        print(new_type, new_type.__dict__)\r\n\"\"\"\r\n```\r\n\r\n- alice: please: log: todos: recommended community standard: support: github issue: Allow for title and body override\r\n  - 67d79ede39629f3b117be0d9f2b5058f88b4efcb\r\n- e2ed7faaa alice: please: log: todos: recommended community standard: code of conduct: github issue: Log issue if file not found\r\n- 8b0df460a alice: please: log: todos: recommended community standard: contributing: github issue: Log issue if file not found\r\n- dbb946649 alice: please: log: todos: recommended community standard: security: github issue: Log issue if file not found\r\n- 59d3052f9 alice: please: log: todos: recommended community standard: Cleanup comments\r\n- 5dbadaf36 operations: innersource: Check for README community health file\r\n- d867a9cda alice: please: log: todos: recommended community standard: readme: github issue: Log issue if file not found\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/200097693-4207fe5c-6d0d-4bfb-8d75-d57bd5768616.png)\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/200098670-1085a185-71af-4193-b5ca-5740d42c952d.png)\r\n\r\n- Ran the three most recent Alice commands to confirm everything is still working\r\n  - `alice shouldi contribute`\r\n  - `alice please log todos`\r\n  - `alice please contribute recommended community standards`\r\n\r\n```console\r\n$ alice -log debug shouldi contribute -keys https://github.com/pdxjohnny/testaaa\r\n$ alice please log todos -log debug -keys https://github.com/pdxjohnny/testaaa\r\n$ alice please contribute -repos https://github.com/pdxjohnny/testaaa -log debug -- recommended community standards\r\n```\r\n\r\n- 7980fc0c7 util: cli: cmd: Add DFFMLCLICMD NewType for use in data flows\r\n- 6d0ce54e1 cli: dataflow: run: records: Allow for passing CLI CMD instance to data flow as input\r\n- 0356b97a9 alice: cli: please: contribute: recommended community standards: Use CLI CMD type from dffml\r\n- 3e8b161a2 alice: cli: please: log: todos: Use CLI CMD type from dffml\r\n- 7c7dd8f7c alice: cli: please: log: todos: Base off dffml dataflow run records\r\n- 1d4d6b2f8 alice: cli: please: log: todos: Explictly pass directory when finding last repo commit\r\n- TODO\r\n  - [ ] SaaSBOM etc. overlays for dataflows for `THREATS.md` analysis\r\n    - https://github.com/CycloneDX/bom-examples/tree/6990885/SaaSBOM/apigateway-microservices-datastores\r\n  - [ ] Find a cleaner way to do same operation reused with different definitions (and defaults)"
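For reference on the VEX statuses listed at the top of this entry, a sketch of how they roughly map onto CycloneDX impact analysis states, plus the shape of a vulnerability entry. The CVE id and purl below are placeholders; see the linked CycloneDX capabilities pages for the authoritative field list.

```python
# Rough mapping from the status list above to CycloneDX analysis states
VEX_STATE_MAP = {
    "Investigating": "in_triage",
    "Vulnerable": "exploitable",
    "Used but not vulnerable": "not_affected",
}

# Sketch of a CycloneDX VEX vulnerability entry (fields abbreviated)
vex_entry = {
    "id": "CVE-2022-0000",  # placeholder CVE
    "analysis": {
        "state": "not_affected",            # our "Used but not vulnerable"
        "justification": "code_not_reachable",
        "detail": "Vulnerable function is never called",
    },
    "affects": [{"ref": "pkg:pypi/example@1.0.0"}],  # hypothetical purl
}
```

The "this version is vuln but another one is not" case is usually expressed as an `exploitable` analysis on the affected ref plus a recommendation pointing at the fixed version, which is where the chains of trust on VDR / VEX issuance come in.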
}
]
},
{
"body": "# 2022-11-05 Engineering Logs",
"replies": [
{
"body": "- https://pretalx.com/pycascades-2023/cfp\r\n- Vol 0: Alice is a Sign not a Cop\r\n - mention conceptual cultural opamp effects of any change (wheel, ML). Information travels faster as a result of some changes.\r\n - grep Wardly map alignment reward strategics plan hypothesis think"
}
]
},
{
"body": "# 2022-11-06 Engineering Logs",
"replies": [
{
"body": "## 2022-11-06 @pdxjohnny Engineering Logs\r\n\r\n- RosettaNet EDI\r\n- https://www.youtube.com/watch?v=ToihJtuELwM\r\n - Methodology for long term storage of verifiable credentials encoded to vol 3 plus vol 5 aware text as prompt for best practices for trust graph inference strategic plan high accuracy for adherence to goals with regards to happiness metrics and fail safe ad-hoc group forming.\r\n- https://colab.research.google.com/drive/1Hl0xxODGWNJgcbvSDsD5MN4B2nz3-n7I?usp=sharing#scrollTo=GDlskFoGYDVt\r\n - GPT-3 but better\r\n - Flan: grep: EAT me (few days ago in this thread) perfect\r\n - this engagement fits in with visualization of software stack of pancakes to grow Alice \ud83e\udd5e (if this works we will hopefully start accelerating quickly, as we accelerate time for her slows)\r\n - Summary of the following (Alice thread) in the style of a avxrh or whatever paper\r\n - Concept of open architecture as an IETF RFC: ^\r\n - Install Flan and associated DFFML overlays within OS DecentrAlice.\r\n - What is a Large Language Model?\r\n - LLMs essentially act as intelligent lookup tables where the promt is like the SQL query\r\n - See gather_inputs call within memory context method, implement prioritizer there (dont try to refactor into dataflow as class first!)\r\n- https://comunica.github.io/comunica-feature-link-traversal-web-clients/builds/default/#datasources=https://foaf-ldux.vercel.app/&query=PREFIX%20foaf:%20%3Chttp://xmlns.com/foaf/0.1/%3E%0ASELECT%20%20DISTINCT%20?Name%20%3FWebID%20WHERE%20%7B%0A%20%20%3Chttps%3A%2F%2Ffoaf-ldux.vercel.app%2F%23me%3E%20foaf%3Aknows%20%3FWebID.%0A%20%20%3FWebID%20foaf%3Aname%20%3FName.%0A%7D&httpProxy=https%3A%2F%2Fproxy.linkeddatafragments.org%2F\r\n - https://twitter.com/mfosterio/status/1589368256086781952\r\n - https://github.com/comunica/comunica\r\n - https://gist.github.com/rubensworks/9d6eccce996317677d71944ed1087ea6\r\n - Grapql-LD\r\n - \ud83d\udee4\ufe0f\u26d3\ufe0f\ud83d\ude84\r\n - > Linked Data on the Web exists in many shapes and forms. Linked Data can be published using plain RDF files in various syntaxes, such as JSON-LD, Turtle, HTML+RDFa, and more. Next to that, different forms of queryable Web interfaces exist, such as SPARQL endpoints and Triple Pattern Fragments (TPF) interfaces. If we want to query Linked Data from the Web, we need to be able to cope with this heterogeneity. Comunica is a quering framework that has been designed to handle different types of Linked Data interfaces in a flexible manner. Its primary goal is executing SPARQL queries over one or more interfaces. Comunica is a meta-query engine Comunica should not be seen as a query engine. Instead, Comunica is a meta query engine using which query engines can be created. It does this by providing a set of modules that can be wired together in a flexible manner. While we provide default configurations of Comunica to easily get started with querying, anyone can configure their own query engine. This fine-tuning of Comunica to suit your own needs, and avoiding the overhead of modules that are not needed.\r\n - We want to combine this with SCITT\r\n - https://github.com/lacanoid/pgsparql\r\n- https://dust.tt/\r\n - Looks like data flow/notebook hybrid! Cool! 
But closed source APIs is what are available so far.\r\n- https://colab.research.google.com/drive/1PDT-jho3Y8TBrktkFVWFAPlc7PaYvlUG?usp=sharing\r\n - Ebook Embeddings Search\r\n- https://www.themarginalian.org/2022/11/02/anais-nin-d-h-lawrence/\r\n - > Life is a process of becoming, a combination of states we have to go through. Where people fail is that they wish to elect a state and remain in it. This is a kind of death.\r\n- https://www.themarginalian.org/2014/11/11/dostoyevsky-dream/\r\n - > All are tending to one and the same goal, at least all aspire to the same goal, from the wise man to the lowest murderer, but only by different ways. It is an old truth, but there is this new in it: I cannot go far astray. I saw the truth. I saw and know that men could be beautiful and happy, without losing the capacity to live upon the earth. I will not, I cannot believe that evil is the normal condition of men\u2026 I saw the truth, I did not invent it with my mind. I saw, saw, and her living image filled my soul for ever. I saw her in such consummate perfection that I cannot possibly believe that she was not among men. How can I then go astray? \u2026 The living image of what I saw will be with me always, and will correct and guide me always. Oh, I am strong and fresh, I can go on, go on, even for a thousand years.\r\n > [\u2026]\r\n > And it is so simple\u2026 The one thing is \u2014 love thy neighbor as thyself \u2014 that is the one thing. That is all, nothing else is needed. You will instantly find how to live.\r\n- Extensible Dynamic Edge Network (EDEN)\r\n - https://magicmirror.builders/\r\n - https://android-developers.googleblog.com/2019/02/an-update-on-android-things.html\r\n - Fuck, they cut the project, that's okay we'll maybe run TockOS (lol, tick tock, appropriate :)\r\n - https://github.com/tock/tock\r\n\r\n![eden](https://user-images.githubusercontent.com/5950433/200349932-91555c81-38cf-4a90-9074-fea92a6aa974.jpeg)\r\n"
}
]
},
{
"body": "# 2022-11-07 Engineering Logs\r\n\r\n- IPVM meeting tomorrow on content addressable execution\r\n - https://ipfs.tech/\r\n - https://www.youtube.com/watch?v=FhwzEKNZEIA\r\n - https://www.youtube.com/watch?v=rzJWk1nlYvs\r\n - See recent notes on content addressable `serviceEndpoint` defined via dataflows pinned by `did:merkle:`\r\n - https://atproto.com/guides/data-repos\r\n- Zephyr\r\n - What is at the top of the build parameter hierarchy\r\n - They use a Kconfig system\r\n - They could use overlays for this\r\n - Firmware build because it's embedded it more build time configs\r\n - How do we organize storage?\r\n - The Knowledge graph and data flows to link to describe those other flat structures\r\n - Need unique build ids\r\n - `did:merkle:` of serialized Open Architecture\r\n - They only ever run a few subsets of Kconfig parameter sets (a few parameters)\r\n - Parameters are any inputs that can effect the build\r\n - Tool chain version\r\n - Marc's example\r\n - Let's say I care about, git version ,tool chain version, various .config\r\n - https://github.com/zephyrproject-rtos/zephyr/pull/51954#issuecomment-1302983454\r\n - I track those for reproducability (and caching) information\r\n - When I want to generate a content addressable build I take all those JSON files (which are the generic graph serisalization of all the stuff you care about) you concat and checksum (`did:merkle:`).",
"replies": [
{
"body": "## 2022-11-07 @pdxjohnny Engineering Logs\r\n\r\n- KCP Edge\r\n - https://github.com/kcp-dev/edge-mc\r\n - Goal: bridge with DID / DWN / serviceEndpoint / DIDComm / Data Flows for arbitrary comms.\r\n - > edge-mc is a subproject of kcp focusing on concerns arising from edge multicluster use cases:\r\n > - Hierarchy, infrastructure & platform, roles & responsibilities, integration architecture, security issues\r\n > - Runtime in[ter]dependence: An edge location may need to operate independently of the center and other edge locations\u200b\r\n > - Non-namespaced objects: need general support\r\n > - Cardinality of destinations: A source object may propagate to many thousands of destinations. \u200b \r\n - Released 3-4 days ago? Chaos smiles on us again :)\r\n - Perfect for EDEN (vol 0: traveler of the edge)\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_preface.md#volume-0-architecting-alice\r\n - We want to bridge KCP edge-mc with \r\n- https://sohl-dickstein.github.io/2022/11/06/strong-Goodhart.html\r\n- System Context\r\n - Stumbled upon \"valid system context\" stuff (I/O must existing / be mapped)\r\n - https://youtu.be/m0TO9IOqRfQ?t=3812&list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK\r\n - https://github.com/intel/dffml/blob/1d4d6b2f817cd987ceff94b4984ce909b7aa3c7f/dffml/df/system_context/system_context.py#L101-L103\r\n- https://atproto.com/guides/data-repos\r\n - We will serialize to ATP when available / more Python\r\n support / obvious what is happening there.\r\n- RosettaNet\r\n - https://github.com/MicrosoftDocs/biztalk-docs/tree/main/biztalk/adapters-and-accelerators/accelerator-rosettanet\r\n - https://github.com/MicrosoftDocs/biztalk-docs/blob/main/biztalk/adapters-and-accelerators/accelerator-rosettanet/TOC.md\r\n - https://github.com/Azure/logicapps/blob/master/templates/rosettanet-encode-response.json\r\n - This looks like it would be good for CI/CD test status in DID land\r\n - As a bridge to tbDEX\r\n- Hitachi if truly powering good is aligned\r\n- https://github.com/SchemaStore/schemastore\r\n- GitHub Actions\r\n - https://docs.github.com/en/developers/webhooks-and-events/webhooks/webhook-events-and-payloads#discussion_comment\r\n - https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows\r\n - https://docs.github.com/en/actions/using-workflows/triggering-a-workflow#available-events\r\n- Flan T5\r\n - https://colab.research.google.com/drive/1Hl0xxODGWNJgcbvSDsD5MN4B2nz3-n7I?usp=sharing#scrollTo=GDlskFoGYDVt\r\n - Paid $9.99 to have access to high memory environment (12GB was not enough for the first import code block)\r\n - It won't generate long form answers :(\r\n - [2022-11-06 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4068656)\r\n - Summary of the following (Alice thread) in the style of a avxrh or whatever paper\r\n - Commit messages from patch diffs\r\n\r\n```python\r\ninput_text = \"\"\"\r\nWrite a peer reviewed scientific paper on the Eiffel Tower:\r\n\"\"\"\r\n\r\ndef generate_long(input_text):\r\n input_ids = tokenizer(input_text, return_tensors=\"pt\").input_ids.to(\"cuda\")\r\n output = model.generate(input_ids, max_new_tokens=100000000)\r\n return [tokenizer.decode(i, skip_special_tokens=True) for i in output]\r\n\r\ngenerate_long(input_text)\r\n```\r\n\r\n- TODO\r\n - [ ] Enable detection of recommended community standards in `docs` and `.github`\r\n - 
https://docs.github.com/en/communities/setting-up-your-project-for-healthy-contributions/adding-support-resources-to-your-project\r\n  - [x] Headphones\r\n    - [x] Craigslist $50: Bose QuietComfort 15\r\n      - I've been wanting these headphones for, what, 12+ years,\r\n        turns out I could have just gone on craigslist at any point.\r\n      - [x] [STRFKR - Open Your Eyes](https://www.youtube.com/watch?v=mkeOoWquAqk&list=RDEMwZ9tKHt9iT5CWajVqMu11w)\r\n        - [x] CHADIG\r\n  - [ ] JavaScript GitHub Actions runner idea still good for use case of automating communications via client side execution of runner / flows.\r\n    - [ ] Implemented via extension or script or console copy/paste or background service worker or something. This allows you to do the incremental addition to the Extensible Dynamic Edge Network (EDEN).\r\n  - Just remembered I found out about solar punk semi-recently\r\n  - didme.me\r\n    - DWN looks similar to this? Really unclear where the implementation is at or what hooks there are"
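For reference, the setup the Flan T5 Colab snippet above assumes before `tokenizer` and `model` exist. The exact checkpoint the notebook used is a guess here, but any flan-t5 checkpoint from Hugging Face transformers should slot in (flan-t5-xl is the kind of thing that needs the paid high memory runtime):

```python
# Assumed setup for the generate_long() snippet above (checkpoint is a guess)
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "google/flan-t5-xl"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(
    checkpoint, torch_dtype=torch.float16  # fp16 to fit GPU memory
).to("cuda")
```

Note that a huge `max_new_tokens` only raises the ceiling; T5-style models still stop at the first EOS token, which matches the "won't generate long form answers" observation from yesterday's log.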
}
]
},
{
"body": "# 2022-11-08 Engineering Logs",
"replies": [
{
"body": "## 2022-11-08 @pdxjohnny Engineering Logs\r\n\r\n- https://arbesman.substack.com/p/-revisiting-the-world-of-simulation\r\n- Rewatching videos to better understand how to make `did:merkle:` cached execution + an image + caching results of `alice please summarize discussion <default to alice discusion> --after \"2022-10-01 00:00+0000\" --before \"2022-11-01 00:00+0000\"` run summarization of each day (configurability on summarization of bullet point settings using Flan (\ud83e\udd5e EAT Me :) )\r\n - Not sure what to say for October monthly progress report :P\r\n - Pretty soon Alice can just generate herself a video and post it for us\r\n - https://www.youtube.com/watch?v=u2ZyqX-9xk8&list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK&t=2640\r\n - reference to go through the http gateway for ipfs and so this is the type of thing that we're going to have the visibility into you know we would store things yeah in ipfs or we would probably actually store things in an operation which will then yield us something\r\n- Abandoned watching the old streams of consciousness and went to didme.me\r\n - Ran into https://github.com/transmute-industries/verifiable-data/tree/main/packages/jsonld-schema#related-projects again\r\n - Found https://w3c-ccg.github.io/traceability-vocab/#VerifiableScorecard ! Which is exactly what we want for some cases (`alice shouldi`, static analysis).\r\n - https://w3c-ccg.github.io/traceability-vocab/#BillOfLadingCredential Can we use this for execution + content address / `did:merkle:` of inputs as described for Zephyr use case / our 2nd Part use case?\r\n - > A transport document issued or signed by a carrier evidencing a contract of carriage acknowledging receipt of cargo. This term is normally reserved for carriage by vessel (marine or ocean bill of lading) or multimodal transport. All B/Ls must indicate the date of issue, name of shipper and place of shipment, place of delivery, description of goods, whether the freight charges are prepaid or collected, and the carrier's signature. A bill of lading is, therefore, both a receipt for merchandise and a contract to deliver it as freight. (source: Olegario Llamazares: Dictionary Of International Trade, Key definitions of 2000 trade terms and acronyms).\r\n - This sounds like something that could be a compute contract as well.\r\n - https://w3c-ccg.github.io/traceability-vocab/openapi/components/schemas/common/BillOfLading.yml\r\n - Beautiful, let's roll with this and modify it into something with less names and places and more DIDs.\r\n- IPVM\r\n - Meeting invite\r\n - > Get up-to-date information at: https://lu.ma/event/evt-0op04xDSoAUBseQ?pk=g-JBsGh2GPRyVgKwn\r\n >\r\n > Click to join: https://lu.ma/join/g-JBsGh2GPRyVgKwn\r\n >\r\n > Event Information:\r\n >\r\n > This call is open to all, but is focused on implementers, following the IETF's rough \"consensus and running code\" ethos.\r\n > The IPVM is an effort to add content-addressed computation to IPFS. The requires specifying calling convention, distributed scheduling, session receipts, mobile computing, and auto-upgradable IPFS internals.\r\n > Links\r\n > - Community Calls \r\n > - GitHub Org\r\n > - Discord Channel \r\n > - IPFS \u00feing '22 Slides\r\n - https://fission.codes/blog/ipfs-thing-breaking-down-ipvm/\r\n - https://twitter.com/pdxjohnny/status/1574975274663706624\r\n - > FISSIONCodes: You've heard of \r\n[@IPFS](https://mobile.twitter.com/IPFS), but what about IPVM? 
Fission is working on the Interplanetary Virtual Machine - a way to add content-addressed computation to IPFS. \ud83e\udd2f With content-addressed computation we can work more efficiently and save time and compute power, all while operating in the decentralized web.\r\n - > John: With regards to bindings and interface discussion. The Open Architecture currently is looking at software definition via manifests and data flows. Dynamic context aware overlays are then used to enable deployment specific analysis, synthesis, and runtime evaluation. This allows for decoupling from the underlying execution environment (i.e. WASM). Traversing metadata graphs on code from remote sources allows for orchestration sandboxing to be dynamic, context aware configurable, and negotiable for the execution of compute contract. This methodology is work in progress. Binding generation (syscalls, etc.) should follow the same overlay enabled pattern. Calling convention here is effectively the (Credential) Manifest.\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/0009-Open-Architecture.rst\r\n - https://intel.github.io/dffml/main/about.html#what-is-key-objective-of-dataflows\r\n - [2022-11-07 Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4073154)\r\n - @marc-hb Zephyr example\r\n - Let's say I care about: git version, toolchain version, various .config\r\n - https://github.com/zephyrproject-rtos/zephyr/pull/51954#issuecomment-1302983454\r\n - I track those for reproducibility (and caching) information\r\n - DID based content addressable solution possibility\r\n - When I want to generate a content addressable build I take all those JSON files (which are the generic graph serialization of all the stuff you care about), concat and checksum them, which for a graph of DIDs is `did:merkle:` (sketch at the end of this entry).\r\n - Side note: The root of the Open Architecture upstream could be referenced as a `did:merkle:`. So Alice's state of the art value for upstream on `Architecting Alice: An Image` would be `upstream: \"did:merkle:123\"`\r\n - [2022-11-02 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4037309)\r\n - Demo metric scan with SCITT receipt used to auth upload results to HTTP server (stream of consciousness / webhook server). 
Root trust in OIDC token similar to fulcio/sigstore github actions slsa demo.\r\n - Future\r\n - [ ] Demo the demo to OpenSSF Metrics WG for collaboration on DB\r\n - [ ] Do this for each `Input`\r\n - [ ] Instead of an HTTP server, the content addressable registry\r\n - [ ] Link via DWNs\r\n - [ ] Hardware rooted keys\r\n - [ ] Knit the above together with `I/L/R/OP/OPIMPNetwork`s for distributed compute\r\n - [ ] Support for trust anchors other than self\r\n - [ ] Caching\r\n - Can we build a quick demo this morning on top of\r\n https://github.com/imjasonh/kontain.me for discussion's sake?\r\n - https://go.dev/learn/\r\n - https://go.dev/doc/install\r\n - https://go.dev/doc/tutorial/getting-started\r\n - https://go.dev/doc/modules/managing-dependencies#naming_module\r\n\r\n```console\r\n$ git clone https://github.com/imjasonh/kontain.me\r\n$ cd kontain.me/\r\n$ export GO111MODULE=on\r\n$ export GOPROXY=\"${HTTPS_PROXY}\"\r\n```\r\n\r\n- QUIC\r\n - https://youtu.be/Dp6FwEfkBqQ\r\n - https://youtu.be/wN9O1MnxIig\r\n- MC Alice\r\n - https://www.youtube.com/playlist?list=PLtzAOVTpO2jYzHkgXNjeyrPFO9lDxBJqi\r\n\r\n```console\r\n$ youtube-dl --no-call-home --no-cache-dir -x --audio-format mp3 --add-metadata --audio-quality 0 --restrict-filenames --yes-playlist --ignore-errors \"https://www.youtube.com/watch?v=Bzd3BjXHjZ0&list=PLtzAOVTpO2jYzHkgXNjeyrPFO9lDxBJqi\"\r\n```\r\n\r\n- Aghin already got us started with webhooks!\r\n - https://intel.github.io/dffml/main/examples/webhook/index.html\r\n - > Aghin, one of our GSoC 2020 students, wrote operations and tutorials which allow users to receive web hooks from GitHub and re-deploy their containerized models and operations whenever their code is updated.\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md\r\n- TODO\r\n - [ ] Update `Architecting Alice: Stream of Consciousness` using webhook demo as upstream."
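- Minimal sketch of the concat-and-checksum `did:merkle:` idea above, assuming a plain SHA-256 Merkle reduction over canonicalized JSON; the actual `did:merkle:` method encoding isn't pinned down here and the example inputs are made up:\r\n\r\n```python\r\nimport hashlib\r\nimport json\r\n\r\n\r\ndef canonical_hash(document: dict) -> bytes:\r\n    # Canonicalize by sorting keys so equivalent JSON hashes identically\r\n    encoded = json.dumps(document, sort_keys=True, separators=(',', ':')).encode()\r\n    return hashlib.sha256(encoded).digest()\r\n\r\n\r\ndef merkle_root(leaves: list) -> bytes:\r\n    # Sort so the root is independent of input order, then pairwise reduce\r\n    # until one root remains (duplicate the last node on odd counts)\r\n    nodes = sorted(leaves)\r\n    while len(nodes) > 1:\r\n        if len(nodes) % 2:\r\n            nodes.append(nodes[-1])\r\n        nodes = [\r\n            hashlib.sha256(nodes[i] + nodes[i + 1]).digest()\r\n            for i in range(0, len(nodes), 2)\r\n        ]\r\n    return nodes[0]\r\n\r\n\r\ndef did_merkle(documents: list) -> str:\r\n    return 'did:merkle:' + merkle_root(\r\n        [canonical_hash(document) for document in documents]\r\n    ).hex()\r\n\r\n\r\n# Example: the JSON files we care about for a reproducible Zephyr-style build\r\nprint(did_merkle([\r\n    {'git': 'v3.2.0'},\r\n    {'toolchain': 'zephyr-sdk-0.15.1'},\r\n    {'config': {'CONFIG_DEBUG': 'y'}},\r\n]))\r\n```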
},
{
"body": "## 2022-11-08 IPVM November Meeting Notes\r\n\r\n- Brooklyn Leading\r\n- **TODO** Link recording\r\n- Agenda\r\n - Updates\r\n - Convos in Lisbon\r\n - Discussion\r\n- Last month didn't happen due to busy-ness\r\n- Lisbon\r\n - Folks on this call were there in person for network labs week\r\n - Talked about IPVM and other topics\r\n - How to plug into other systems\r\n - How it's different than other things\r\n - IPVM got a grant, some funding, there is community faith\r\n - First step is to work on invocation spec\r\n - If we do a good job then in the next week or so it can serve as a basis for a few different projects\r\n - BucketVM\r\n - UCAN based invocation\r\n - WarpForge\r\n - Build system, sets up linux sandbox then does deterministic builds (not WASM)\r\n - Goals: Build libc from source\r\n - Possibly aligned\r\n - Catalogs and formulas\r\n - Optimine?\r\n - Nondeterministic computation in docker containers\r\n - Getting existing workloads running\r\n - They have a golang based configuration\r\n - IPVM is less interested in distributed algs and more interested in doing fast WASM\r\n- How is interop being planned?\r\n - IPVM wants to be fully deterministic, cached, verifiable\r\n - Often need to resolve IPNS link, send email, etc., done \"off chain\"\r\n - WASI is one way to do that\r\n - That's not deterministic, you can do traced execution and read some stream in but you can't parallelize and compare results\r\n - If you use a managed effect system, you leave all the impure stuff to the runtime (toy sketch at the end of these notes)\r\n - Do you have access to run this? Yes? Just log a yes on you have access to run that effect.\r\n - Effects incoming run before WASM, effects outgoing\r\n - Sounds very similar to OA\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/0009-Open-Architecture.rst\r\n - https://github.com/intel/dffml/blob/main/docs/about.rst#what-is-key-objective-of-dataflows\r\n - Example Effect: Operation invocation manifest, it calls back in using the input effect.\r\n - If there are chunks then they can call into IPVM and it can use the\r\n - Effects are like input events in DFFML dataflows\r\n - Affinity\r\n - I already have this cached, you should send me these effects\r\n - I have a GPU\r\n - Related: EDEN - [2022-11-08 @pdxjohnny Engineering Logs]()\r\n - Brooklyn has been laying out and thinking about what's reasonable\r\n - Data pipelines, composable out of existing jobs\r\n - Can tell it to run things concurrently\r\n - Dataflows are nice for this, diamond validation came up as an example\r\n - Issues: JSON due to DAG\r\n - There is a draft PR in the repo which says let's just name all the jobs\r\n - https://github.com/ipvm-wg/spec/pull/8\r\n - There might be a multi value output\r\n - This is static invocation, we know ahead of time this is the level of parallelism\r\n - You might have an output which invokes more jobs\r\n- Ideally, here's a UCAN, please do it\r\n - There is already a place for authorizations\r\n - In a UCAN, you have all the info you need to say please run this\r\n - Sometimes people will add `invoke:true`, it's unclear if you should be able to delegate.\r\n - Another approach is to put a thin wrapper, you can rip off the auth part and wrap a new one\r\n- Irakli\r\n - CID of WASM with data in, not invocation by CID, but invocation by mutable pointer?\r\n - Brooklyn says ya we want multiple pointers?\r\n - There is a before block in the invocation, do this effect as an input, then place that and that gets a name.\r\n - How do we define interfaces?\r\n - 
https://radu-matei.com/blog/intro-wasm-components/ might get into major interfaces soon\r\n - Challenge of links outside of IPLD\r\n - Need to have some native notion of \"I'm reading 9TB data but I have to read in blocks\" needs to read off of streams and emit streams\r\n - Autocodec inside of IPVM usually makes sense\r\n - Instead of baking in JSON and CBOR and protobuf and all these things, we just pass around WASM and say run this on these blocks of data, it's like ebpf, it's dynamic\r\n - To get their webfilesystem to show in a gateway they had to do a bunch of hacks right now\r\n - If you put it in IPVM then you can just reuse that as the distributed compute method\r\n- What happens when a user creates one of these? How do we put syntactic sugar on top.\r\n - How do we look at caching?\r\n- Non-goal: Support WASI right off the bat\r\n - WASM allows us to restrict what will be run with effects\r\n - Putting all effects on outside then WASM always allows us to use\r\n - They want to replace FaaS stuff with distributed compute **ALIGNED**\r\n - Fission goals: Decentralized open functions as a service, small short deterministic data flow, simple image transformations, etc.\r\n- Coming from erlang/elixir world\r\n - What happens when there is an issue? How does the erlang supervision pattern apply, and for failure cases / states for dags, how do we filter off into declarative specs based on locality?\r\n - Not sure if giving people the choice of supervisor pattern is the right choice\r\n - We should come up with the secure by default (giving people the ability to modify supervision patterns has been a loss for erlang)\r\n - With great power comes great responsibility, supervision is the correct concept, IPVM could be opinionated\r\n - Affinity, this depends on that, defined failure modes with overlays?\r\n - Look at k8s affinity and anti-affinity patterns\r\n - Please go to another node\r\n - WASM is a pure function with pure data (deterministic)\r\n - People want things that look like objects or actors\r\n - You can build that around this!\r\n - It will look like eventual consistency or software transactional memory\r\n - If you need locking then can use effects and so forth to land where you need\r\n- IPVM: we want an analysis step, I'm going to reorder, come up with the dependency tree, (then overlay failure modes possible?)\r\n - Failure modes defined as effects?\r\n- IPVM as a distributed scheduler\r\n - Borrow VM and compiler tricks (if on a single threaded machine run that dispatch rest)\r\n - Can look at \"gas\" costs (distributed compute cost, ref: Ethereum https://ethereum.org/en/developers/docs/gas/)\r\n- Melanie: Microkernel\r\n - From chat: There is always a minimal set of functions application code needs to communicate with the system - in our case we care about IPLD blocks. Is there a way to define affinity, so if a node has executed a command, loaded the IPFS in its cache, it\u2019s more likely to get the next job with same base data? Looks like it could be done outside Wasm. 
I'd like to say IPVM host code is close-ish to a microkernel that ships with a kernel that can be pasted on modules when they get run to provide a better interface to the system calls\r\n - Looking to have effectively this syscall style interface which you can reference for CID\r\n - Works on filecoin VM, using WASM and microkernel approach has been useful\r\n- Autocodec sounds similar to a WASM version of shim\r\n - https://github.com/intel/dffml/pull/1273\r\n - here to replace dag-cbor, dag-cb, running over dags of different types\r\n\r\n---\r\n\r\nSource: [docs/arch/alice/discussion/0023/reply_0044.md](https://github.com/intel/dffml/discussions/1369#discussioncomment-2778357)\r\n\r\n- https://hexdocs.pm/flow/Flow.html\r\n - Elixir sends the function where the data is, so it takes care of scheduling based on locality\r\n - Has comms at base layer\r\n - OTP - erlang is a glorified supervision tree\r\n - Can hook into this to issue commands to erlang VMs, gives you fault tolerance\r\n - Can run this over web3\r\n - It can manage how it fails\r\n - Backpressure is watching the infinite stream and it's monitoring and watching and detecting if it's oversubscribing the resources available\r\n - People are using elixir with rust\r\n - We deploy an elixir app\r\n - We give a stream of data to the pipeline\r\n - The producer plucks the head of the stream for the processes downstream to do their work and it will stitch the data back together. It will partition the data in parallel and then \r\n - If your process crashes, the supervision tree decides what to do (strategic plans)\r\n - Model in elixir is crash, then supervisors break down\r\n - Broadway is what is producing the events, flow is what\r\n - Supervision tree could initiate fail fast patterns\r\n - Discord uses elixir at the proxy and then rust for processing"
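- Toy sketch of the managed effect idea above: the pure function returns effect descriptions instead of doing I/O, and the runtime authorizes, executes, and receipts them. Plain Python, names are illustrative, not the IPVM spec:\r\n\r\n```python\r\nimport json\r\nfrom dataclasses import dataclass\r\n\r\n\r\n@dataclass\r\nclass Effect:\r\n    # Description of impure work the runtime performs on the job's behalf\r\n    name: str\r\n    args: dict\r\n\r\n\r\ndef pure_job(inputs: dict):\r\n    # Deterministic core: no I/O here, just yield requested effects\r\n    yield Effect('http.get', {'url': inputs['url']})\r\n    yield Effect('log', {'message': 'fetched ' + inputs['url']})\r\n\r\n\r\ndef runtime(job, inputs: dict) -> list:\r\n    # The runtime sits outside the sandbox: it can check authorization\r\n    # (e.g. a UCAN), run the effect, cache, and record a receipt\r\n    receipts = []\r\n    for effect in job(inputs):\r\n        receipts.append({'effect': effect.name, 'args': effect.args, 'ok': True})\r\n    return receipts\r\n\r\n\r\nprint(json.dumps(runtime(pure_job, {'url': 'https://example.com'}), indent=2))\r\n```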
}
]
},
{
"body": "# 2022-11-09 Engineering Logs\r\n\r\n- Workstreams\r\n - [ ] Knowledge graph sharing (basics)\r\n - [ ] Provide queryable data via? JSON-LD static file serving to start?\r\n - [ ] Implement initial dumps to chosen format via DFFML plugin patches for first integration.\r\n - [ ] Query via GraphQL-LD (https://github.com/comunica/comunica)\r\n - [ ] Data security from [SCITT](https://scitt.io)\r\n - [ ] Identity from probably github.com/user.keys or keybase or QR code (HSM on phone) or other (overlayed?) methods.\r\n - [ ] Distributed Execution\r\n - [ ] Sandboxing\r\n - [ ] Overlays (next phase parsers) for `policy.yml` to define what are acceptable sandboxing criteria (annotation to the chosen orchestrator, aka the sandboxing method / manager during execution).\r\n - Overlays to parse more types of available sandboxing mechanisms and determine how much we like them or not.\r\n - [ ] Reference implementation of content addressable compute contract execution using Decentralized Identifier, Verifiable Credential, and Decentralized Web Node based for layer 7/8?\r\n - [ ] Entity Analysis Trinity\r\n - [ ] Static Analysis\r\n - [ ] Need to understand dependencies\r\n - [ ] Living Threat Models\r\n - [ ] `THREATS.md` talks about and includes maintenance / lifecycle health (recommended community standards at minimum). \r\n - Related: https://github.com/johnlwhiteman/living-threat-models/issues/1\r\n - [ ] Open Architecture\r\n - [ ] Conceptual upleveling of dependencies into architecture via static overlay with architecture or overlay to synthesize.\r\n - [ ] Feedback loop\r\n - [ ] Stream of Consciousness\r\n - #1315\r\n - https://github.com/w3c/websub\r\n - https://youtu.be/B5kHx0rGkec\r\n - 12 years, this has existed for 12 years, how am I just now finding out about this.\r\n - we want this but callbacks supported as data flows / open architecture / use webrtc to call back.\r\n - http://pubsubhubbub.appspot.com/\r\n - [ ] Implement Gatekeeper (`get_operations()`/`gather_inputs()`)\r\n - [ ] Overlays / schema extensions for `policy.yml` which prioritizer\r\n understands how to leverage.\r\n - [ ] Implement Prioritizer (`get_operations()`/`gather_inputs()`)\r\n - [ ] Interfaces\r\n - [ ] Keeping GitHub workflows up to date\r\n - Usages of reusables templated and updated on trigger from upstream\r\n or template or within context config modifications.",
"replies": [
{
"body": "## 2022-11-09 @pdxjohnny Engineering Logs\r\n\r\n- https://github.com/w3c/websub/tree/master/implementation-reports\r\n - https://github.com/marten-de-vries/Flask-WebSub\r\n - Publisher client with Verifiable Credentials and Credential Manifests\r\n - https://identity.foundation/credential-manifest/#credential-requirement-discovery\r\n - A Verifiable Credential is then issued\r\n - https://w3c-ccg.github.io/traceability-vocab/#BillOfLadingCredential\r\n - https://w3c-ccg.github.io/traceability-vocab/openapi/components/schemas/credentials/BillOfLadingCredential.yml\r\n - https://w3c-ccg.github.io/traceability-vocab/openapi/components/schemas/common/BillOfLading.yml\r\n - QEMU, then firecracker, let's see how fast she'll roll\r\n- https://hub.docker.com/r/exampleorg/uni-resolver-driver-did-example\r\n - https://github.com/decentralized-identity/universal-resolver/pull/100/files\r\n - https://github.com/decentralized-identity/universal-resolver/blob/main/docs/driver-development.md\r\n - https://github.com/decentralized-identity/universal-resolver/blob/main/docker-compose.yml\r\n- time is relative by locality\r\n - clustering state of art / train of thought field it falls into grep twine threads\r\n- https://github.com/ArtracID/ArtracID-DID-ART-Method\r\n - Can we combine this with didme.me / SCITT? Art world has similar data provenance supply chain fundamentals of authenticity attestations.\r\n - `did:art:alice:<did merkle dag of manifest / upstream>`\r\n - See \"Architecting Alice: An Image\"\r\n- https://jena.apache.org/tutorials/sparql_data.html\r\n- https://linkeddatafragments.org/software/#server\r\n- https://github.com/benj-moreau/odmtp-tpf#sparql-queries-over-github-api\r\n- TODO\r\n - [ ] Modify BillOfLadingVC schema into something with fewer names and places and more DIDs.\r\n - https://w3c-ccg.github.io/traceability-vocab/openapi/components/schemas/common/BillOfLading.yml\r\n - [ ] Play with https://github.com/benj-moreau/odmtp-tpf#sparql-queries-over-github-api as backend and GraphQL-LD to query (sketch at the end of this entry)\r\n - [2022-11-06 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4068656)\r\n- https://share.tube/videos/local\r\n - https://joinpeertube.org/instances\r\n - Does this work / exist for streaming? It seems more and more like hybrid federated principles / web5 are looking like our web2 -> web5 bridge\r\n- https://fission.codes/blog/webnative-app-template/\r\n- https://octodon.social/@cwebber/109307940669755800\r\n- https://www.w3.org/TR/activitypub/\r\n - This overview tutorial might be the right base for our POC of sharing data flow / knowledge graphs\r\n- TODO\r\n - [ ] https://www.w3.org/TR/activitypub/ (+DERP optionally maybe tunneled over webrtc) for stream of consciousness input network on \"shared\" exec\r\n - [ ] Fix DFFML build pipelines and build a container to submit using HTTP service data flow endpoint config as DID resolver for `did:oa:`\r\n - [ ] Let's maybe mess with https://github.com/mastodon/mastodon/blob/main/docker-compose.yml and see if we can start talking to Alice via that.\r\n - [ ] Then we gradually add in DID, VC, etc. to that\r\n - [x] Install Linux on SSD\r\n - [ ] Mouse's wheel is broken, need a new mouse\r\n - It doesn't even do the drag to scroll anymore on fedora 36"
}
]
},
{
"body": "# 2022-11-10 Engineering Logs\r\n\r\n- Tomorrow\r\n - https://github.com/microsoft/scitt-api-emulator\r\n - https://github.com/microsoft/scitt-ccf-ledger/blob/main/pyscitt/pyscitt/did.py\r\n - https://atproto.com/guides/lexicon#schema-format",
"replies": [
{
"body": "## 2022-11-10 @pdxjohnny Engineering Logs\r\n\r\n- Current focus is around leveraging threat model and architecture information to engage in automated context informed proactive, reactive, or periodic (tech debt cleanup) mitigation activities. This is in pursuit of enabling decentralized gamification / continuous improvement of the security lifecycle / posture of open source projects. Enabling them to overlay their custom logic on upstream OSS analysis and policy evaluation will ideally increase helpfulness of static and dynamic analysis and automated remediation.\r\n - https://gist.github.com/pdxjohnny/07b8c7b4a9e05579921aa3cc8aed4866\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/README.md#rolling-alice-volume-0-introduction-and-context\r\n - \"Snapshot of System Context\" here is content addressable execution\r\n - [2022-11-08 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4086860)\r\n- https://github.com/TimothyClaeys/pycose\r\n- https://medium.com/transmute-techtalk/neo4j-graph-data-science-with-verifiable-credential-data-98b806f2ad78\r\n - I saw this the other day and should have dug more\r\n- https://w3c.github.io/sync-media-pub/\r\n- Poly repo pull model dev tooling rubric into issues into pull request review for inclusion in 2nd or 3rd party set (or any manifest or within any overlay, just change tracking but rubric assisted for distributed checking, see SCITT OpenSSF use case with mention of VEX/VDR/SBOM).\r\n- https://github.com/decentralized-identity/credential-manifest/issues/125#issuecomment-1278620849\r\n - https://identity.foundation/presentation-exchange/#input-evaluation\r\n - Similar to [2022-11-07 Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4073154)\r\n - System Context\r\n - Stumbled upon \"valid system context\" stuff (I/O must exist / be mapped)\r\n - https://youtu.be/m0TO9IOqRfQ?t=3812&list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK\r\n - https://github.com/intel/dffml/blob/1d4d6b2f817cd987ceff94b4984ce909b7aa3c7f/dffml/df/system_context/system_context.py#L101-L103\r\n- https://github.com/CycloneDX/bom-examples/tree/master/VEX/CISA-Use-Cases\r\n- https://github.com/hadolint/hadolint\r\n- https://github.com/sahlberg/fuse-nfs\r\n- https://socialhub.activitypub.rocks/pub/guide-for-new-activitypub-implementers\r\n- Let's just try implementing ATP\r\n - https://atproto.com/guides/lexicon#schema-format\r\n - ATP + SCITT! APT + SCITT! 
**APT + SCITT!**\r\n- XRPC looks similar to IPVM with effects\r\n - https://atproto.com/specs/xrpc\r\n- (websub + OA) + ATP (Data repos)\r\n - SCITT becomes identity help (notary) and format of message encapsulated in ATP; in this case trust chains established via context / content analysis of ATP message (maybe contains a jwk)\r\n- https://github.com/w3c/activitystreams/blob/master/implementation-reports/activipy.md\r\n- https://github.com/microsoft/unilm\r\n - https://github.com/microsoft/unilm/tree/master/edgelm\r\n - > We evaluate EdgeFormer on the benchmarks of three popular seq2seq tasks: CoNLL-14 for GEC, XSUM for Abstractive Summarization, and SQuAD-NQG for Question Generation.\r\n - https://github.com/microsoft/unilm/tree/master/adalm\r\n - https://github.com/microsoft/unilm/tree/master/layoutlmv3\r\n - Manifest->screenshot\r\n- https://github.com/w3c/activitystreams/blob/master/implementation-reports/annotation-protocol-server.md\r\n - Inventory-esque #1207\r\n- `curl --url-query name@file https://example.com`\r\n - https://daniel.haxx.se/blog/2022/11/10/append-data-to-the-url-query/\r\n- https://activipy.readthedocs.io/en/latest/about.html#what-is-activitystreams-how-might-it-help-me\r\n - > And simple is good, because let\u2019s face it, most users of most web application APIs are like poor Billy Scripter, a kid who has some scripting language like Ruby or Python or Javascript and some JSON parser in a toolbox and that\u2019s about it. Billy Scripter knows how to parse JSON pulled down from some endpoint, and that\u2019s about all he knows how to do. Poor Billy Scripter! But it\u2019s okay, because ActivityStreams is simple enough that Billy can make it by. And because the [ActivityStreams Core](http://www.w3.org/TR/activitystreams-core/) serialization specifies that the [ActivityStreams Vocabulary](http://www.w3.org/TR/activitystreams-vocabulary/) is always implied and that those terms must always be available, Billy will always know what a [Like](http://www.w3.org/TR/activitystreams-vocabulary/#dfn-like) object or a [Note](http://www.w3.org/TR/activitystreams-vocabulary/#dfn-note) means. 
Horray for Billy!\r\n- TODO\r\n - [ ] John, it's VDR and VEX, don't overcomplicate it, you can reference via DID later, stop getting distracted by shiny DIDs\r\n - Remember it was always the initial plan to use this as the stream interface, maybe add websub\r\n - https://docs.oasis-open.org/csaf/csaf/v2.0/csaf-v2.0.html\r\n - https://www.oasis-open.org/committees/tc_home.php?wg_abbrev=csaf\r\n - CSAF is the overarching framework VEX fits into\r\n - The SBOM almost acts like the `@context` for JSON-LD\r\n - Do what you know, don't forget about `cve-bin-tool`, maybe find notes on prototyping that flow, maybe we should just do that based on binary analysis of project.\r\n - Then use learnings to do Python packages / shouldi deptree\r\n - Okay I forgot that might have also been the original plan, stick with the plan.\r\n - [ ] VEX via simple HTTP service https://github.com/CycloneDX/bom-examples/tree/master/VEX/CISA-Use-Cases (sketch below)\r\n - Future\r\n - [ ] Updates via websub\r\n- Future\r\n - [ ] websub stream of consciousness to facilitate fetching new VEX/VDR\r\n - [ ] websub over DIDComm callback exec via open architecture\r\n - [ ] VEX/VDR/SBOM/SCITT via ATP\r\n - [ ] https://github.com/sahlberg/fuse-nfs userspace (GitHub Actions) proxy\r\n over DERP to NFS spun up via dispatch (communicate across multiple jobs).\r\n - [ ] Check for updates to credential manifest thread: https://github.com/decentralized-identity/credential-manifest/issues/125#issuecomment-1310728595\r\n - [ ] [2022-11-10 SCITT API Emulator Spin Up](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4110695)"
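- Minimal sketch of the \"VEX via simple HTTP service\" TODO above: just serve a CycloneDX VEX document over HTTP, websub update pushes would layer on later. The `vex.json` path and file name are illustrative:\r\n\r\n```python\r\nfrom http.server import BaseHTTPRequestHandler, HTTPServer\r\n\r\n\r\nclass VEXHandler(BaseHTTPRequestHandler):\r\n    def do_GET(self):\r\n        # Serve the CycloneDX VEX document at a well known path\r\n        if self.path != '/vex.json':\r\n            self.send_error(404)\r\n            return\r\n        with open('vex.json', 'rb') as stream:\r\n            body = stream.read()\r\n        self.send_response(200)\r\n        self.send_header('Content-Type', 'application/vnd.cyclonedx+json')\r\n        self.end_headers()\r\n        self.wfile.write(body)\r\n\r\n\r\nHTTPServer(('0.0.0.0', 8080), VEXHandler).serve_forever()\r\n```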
},
{
"body": "## 2022-11-10 SCITT Technical Meeting\r\n\r\n- https://armltd.zoom.us/j/95609091197?pwd=V3NndVF1WGZzNUJDUGUzcEVWckxOdz09\r\n- Software use case is one of many, came up many times in discussion in London.\r\n- Lot of work got done over the weekend during hackathon.\r\n- SCITT API emulator\r\n- https://github.com/microsoft/scitt-api-emulator\r\n - Also running confidential consortium ledger\r\n - https://github.com/microsoft/scitt-ccf-ledger\r\n - https://github.com/microsoft/scitt-ccf-ledger/tree/main/demo/github\r\n - https://github.com/microsoft/scitt-ccf-ledger/blob/main/pyscitt/pyscitt/did.py\r\n\r\n![provenance_for_the_chaos_God](https://user-images.githubusercontent.com/5950433/201148302-325c58a6-166d-494b-b162-5feaea557d87.jpg)"
},
{
"body": "## 2022-11-10 SCITT API Emulator Spin Up\r\n\r\n[The Alice thread continues!](https://mastodon.social/@pdxjohnny/109320563491316354)\r\nWe take one step further towards decentralization as we federate our way away from Twitter.\r\n\r\nToday we're playing with SCITT and ATProto: https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4104302\r\n\r\nPrev: https://twitter.com/pdxjohnny/status/1585488415864557568\r\n\r\n### SCITT (virtual) CCF Spin Up\r\n\r\nWe have liftoff with virtual confidential ledger (not really using SGX).\r\n\r\n- https://github.com/microsoft/scitt-ccf-ledger\r\n- https://github.com/microsoft/scitt-ccf-ledger/tree/main/demo/github\r\n- https://github.com/microsoft/scitt-ccf-ledger/blob/main/pyscitt/pyscitt/did.py\r\n- https://asciinema.org/a/536774\r\n\r\n```console\r\n$ unxz -d - < ~/asciinema/DESKTOP-3LLKECP-rec-2022-11-10T08:52:20-08:00.json.xz | tee /tmp/scitt-ccf-ledger.json\r\n$ cat /tmp/scitt-ccf-ledger.json | python -m asciinema play -s 20 -\r\n$ python -m asciinema upload /tmp/scitt-ccf-ledger.json \r\n```\r\n\r\n[![asciicast](https://asciinema.org/a/536709.svg)](https://asciinema.org/a/536709)\r\n\r\n### 2022-11-14 SCITT API Emulator Spin Up\r\n\r\n- References\r\n - https://github.com/microsoft/scitt-api-emulator/blob/2502eda6b99936a7b28792ca3fd6ba9fbf97e7ba/README.md\r\n\r\n```console\r\n$ git clone https://github.com/microsoft/scitt-api-emulator\r\n$ cd scitt-api-emulator\r\n$ git ls-files | xargs -I '{}' -- sed -i 's/python3.8/python3.10/g' '{}'\r\n$ python -m rich.markdown README.md\r\n$ ./scitt-emulator.sh server --workspace workspace/ --tree-alg CCF\r\nSetting up Python virtual environment.\r\n[notice] A new release of pip available: 22.2.2 -> 22.3.1\r\n[notice] To update, run: pip install --upgrade pip\r\nService private key written to workspace/storage/service_private_key.pem\r\nService parameters written to workspace/service_parameters.json\r\nService parameters: workspace/service_parameters.json\r\n * Serving Flask app 'scitt_emulator.server'\r\n * Debug mode: on\r\nWARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.\r\n * Running on all addresses (0.0.0.0)\r\n * Running on http://127.0.0.1:8000\r\n * Running on http://192.168.1.115:8000\r\nPress CTRL+C to quit\r\n * Restarting with stat\r\nService parameters: workspace/service_parameters.json\r\n * Debugger is active!\r\n * Debugger PIN: 000-000-000\r\n```\r\n\r\n- Ran commands from `README.md`\r\n\r\n```console\r\n$ ./scitt-emulator.sh server --workspace workspace/ --tree-alg CCF\r\n$ ./scitt-emulator.sh client create-claim --issuer did:web:example.com --content-type application/json --payload '{\"sun\": \"yellow\"}' --out claim.cose\r\n$ ./scitt-emulator.sh client submit-claim --claim claim.cose --out claim.receipt.cbor\r\n$ ./scitt-emulator.sh client retrieve-claim --entry-id 1 --out claim.cose\r\n$ ./scitt-emulator.sh client retrieve-receipt --entry-id 1 --out receipt.cbor\r\n$ ./scitt-emulator.sh client verify-receipt --claim claim.cose --receipt claim.receipt.cbor --service-parameters workspace/service_parameters.json\r\n```\r\n\r\n- It works!\r\n\r\n> The `verify-receipt` command verifies a SCITT receipt given a SCITT claim and a service parameters file. This command can be used to verify receipts generated by other implementations.\r\n>\r\n> The `service_parameters.json` file gets created when starting a service using `./scitt-emulator.sh server`. 
The format of this file is not standardized and is currently:\r\n>\r\n> ```json\r\n> {\r\n> \"serviceId\": \"emulator\",\r\n> \"treeAlgorithm\": \"CCF\",\r\n> \"signatureAlgorithm\": \"ES256\",\r\n> \"serviceCertificate\": \"-----BEGIN CERTIFICATE-----...\"\r\n> }\r\n> ```\r\n\r\n- We upload `alice shouldi contribute` dataflow to SCITT and get a receipt!\r\n - Friends, today is a great day. :railway_track:\r\n - Next stop, serialization / federation with Alice / Open Architecture serialization data flow as SCITT service.\r\n\r\n[![asciicast](https://asciinema.org/a/537643.svg)](https://asciinema.org/a/537643)"
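- Same flow scripted from Python, wrapping only the emulator CLI commands shown above (assumes we are in the scitt-api-emulator checkout with the server already running):\r\n\r\n```python\r\nimport subprocess\r\n\r\n\r\ndef scitt(*args: str) -> None:\r\n    # Wrap the emulator CLI; raises CalledProcessError on non-zero exit\r\n    subprocess.run(['./scitt-emulator.sh', *args], check=True)\r\n\r\n\r\n# Create a claim, submit it, then verify the returned receipt\r\nscitt('client', 'create-claim', '--issuer', 'did:web:example.com',\r\n      '--content-type', 'application/json',\r\n      '--payload', '{\"sun\": \"yellow\"}', '--out', 'claim.cose')\r\nscitt('client', 'submit-claim', '--claim', 'claim.cose',\r\n      '--out', 'claim.receipt.cbor')\r\nscitt('client', 'verify-receipt', '--claim', 'claim.cose',\r\n      '--receipt', 'claim.receipt.cbor',\r\n      '--service-parameters', 'workspace/service_parameters.json')\r\n```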
}
]
},
{
"body": "# 2022-11-11 Engineering Logs",
"replies": [
{
"body": "## 2022-11-11 @pdxjohnny Engineering Logs\r\n\r\n- https://fluxcd.io/flux/guides/image-update/\r\n - Possible `FROM` rebuild chain helper\r\n- https://github.com/Xubuntu/lightdm-gtk-greeter-settings\r\n - https://github.com/Xubuntu/lightdm-gtk-greeter-settings/issues/4#issuecomment-1312059288\r\n - Same on Fedora 37\r\n - Root cause was permissions issue, needs to be world readable and\r\n all directories which are parents need to be world readable as\r\n well. Moved file from `/root` to `/opt/wallpapers/` and ensured\r\n permissions were correct.\r\n - ![reproduced-on-fedora-37-launchpad-lightdm-gtk-greeter-settings-bug-1593986](https://user-images.githubusercontent.com/5950433/201404906-c7f5d800-a803-4005-bfbf-129c2f45a096.png)\r\n\r\n```console\r\n$ sudo mkdir /opt/wallpapers/\r\n$ sudo stat /opt/wallpapers/\r\n File: /opt/wallpapers/\r\n Size: 27 Blocks: 0 IO Block: 4096 directory\r\nDevice: 253,1 Inode: 9450093 Links: 2\r\nAccess: (0755/drwxr-xr-x) Uid: ( 0/ root) Gid: ( 0/ root)\r\nContext: unconfined_u:object_r:usr_t:s0\r\nAccess: 2022-11-11 10:30:55.826849997 -0800\r\nModify: 2022-11-11 10:30:52.989865945 -0800\r\nChange: 2022-11-11 10:30:52.989865945 -0800\r\n Birth: 2022-11-11 10:30:32.291982299 -0800\r\n$ sudo cp /root/wallpaper.jpg /opt/wallpapers/\r\n$ file /opt/wallpapers/wallpaper.jpg\r\n/opt/wallpapers/wallpaper.jpg: JPEG image data, JFIF standard 1.01, aspect ratio, density 218x218, segment length 16, Exif Standard: [TIFF image data, big-endian, direntries=7, orientation=upper-left, xresolution=98, yresolution=106, resolutionunit=2, software=Pixelmator Pro 2.1.3, datetime=2013:07:16 13:17:42], baseline, precision 8, 6016x3384, components 3\r\n$ stat /opt/wallpapers/wallpaper.jpg\r\n File: /opt/wallpapers/wallpaper.jpg\r\n Size: 2187975 Blocks: 4280 IO Block: 4096 regular file\r\nDevice: 253,1 Inode: 9752102 Links: 1\r\nAccess: (0644/-rw-r--r--) Uid: ( 0/ root) Gid: ( 0/ root)\r\nContext: unconfined_u:object_r:usr_t:s0\r\nAccess: 2022-11-11 10:31:06.320791009 -0800\r\nModify: 2022-11-11 10:30:52.989865945 -0800\r\nChange: 2022-11-11 10:30:52.989865945 -0800\r\n Birth: 2022-11-11 10:30:52.989865945 -0800\r\n```\r\n\r\n- Resize root LUKS partition on new fedora install.\r\n - https://www.golinuxcloud.com/resize-luks-partition-shrink-extend-decrypt/#Resize_LUKS_Partition\r\n\r\n```console\r\n$ df -h\r\nFilesystem Size Used Avail Use% Mounted on\r\ndevtmpfs 4.0M 0 4.0M 0% /dev\r\ntmpfs 7.8G 101M 7.7G 2% /dev/shm\r\ntmpfs 3.1G 1.9M 3.1G 1% /run\r\n/dev/mapper/fedora_fedora-root 15G 15G 754M 96% /\r\ntmpfs 7.8G 3.6M 7.8G 1% /tmp\r\n/dev/sdc3 1.1G 296M 751M 29% /boot\r\n/dev/sdc2 575M 6.2M 569M 2% /boot/efi\r\ntmpfs 1.6G 168K 1.6G 1% /run/user/1000\r\n$ sudo blkid -t TYPE=crypto_LUKS -o device\r\n/dev/sdc4\r\n$ lsblk\r\nNAME MAJ:MIN RM SIZE RO TYPE MOUNTPOINTS\r\nsdc 8:32 0 232.9G 0 disk \r\n\u251c\u2500sdc1 8:33 0 16M 0 part \r\n\u251c\u2500sdc2 8:34 0 576M 0 part /boot/efi\r\n\u251c\u2500sdc3 8:35 0 1G 0 part /boot\r\n\u2514\u2500sdc4 8:36 0 231.2G 0 part \r\n \u2514\u2500luks-18013279-e995-45bc-bcb8-83dda718da78 253:0 0 231.2G 0 crypt \r\n \u2514\u2500fedora_fedora-root 253:1 0 15G 0 lvm /\r\nzram0 252:0 0 8G 0 disk [SWAP]\r\n$ sudo cryptsetup status fedora_fedora-root\r\n/dev/mapper/fedora_fedora-root is active and is in use.\r\n type: n/a\r\n$ sudo cryptsetup status luks-18013279-e995-45bc-bcb8-83dda718da78\r\n/dev/mapper/luks-18013279-e995-45bc-bcb8-83dda718da78 is active and is in use.\r\n type: LUKS2\r\n cipher: aes-xts-plain64\r\n keysize: 512 bits\r\n key 
location: keyring\r\n device: /dev/sdc4\r\n sector size: 512\r\n offset: 32768 sectors\r\n size: 484860697 sectors\r\n mode: read/write\r\n flags: discards\r\n```\r\n\r\n- Reboot to live image of fedora server 36\r\n - Run `lvextend` and `xfs_growfs` on `/dev/mapper/fedora_fedora-root`, grow\r\n by unused space size, around +216.1G.\r\n\r\n```console\r\n$ lsblk\r\n$ cryptsetup luksOpen /dev/sdc4 luks\r\n$ cryptsetup status luks\r\n$ lvextend -L +216.1G /dev/mapper/fedora_fedora-root\r\n$ mount /dev/mapper/fedora_fedora-root /mnt\r\n$ xfs_growfs /dev/mapper/fedora_fedora-root\r\n```\r\n\r\n- Boot and check new disk space, 216G available.\r\n\r\n```console\r\n$ df -h\r\nFilesystem Size Used Avail Use% Mounted on\r\ndevtmpfs 4.0M 0 4.0M 0% /dev\r\ntmpfs 7.8G 93M 7.7G 2% /dev/shm\r\ntmpfs 3.1G 1.9M 3.1G 1% /run\r\n/dev/mapper/fedora_fedora-root 232G 16G 216G 7% /\r\ntmpfs 7.8G 3.5M 7.8G 1% /tmp\r\n/dev/sdc3 1.1G 296M 751M 29% /boot\r\n/dev/sdc2 575M 6.2M 569M 2% /boot/efi\r\ntmpfs 1.6G 168K 1.6G 1% /run/user/1000\r\n```\r\n\r\n- https://github.com/decentralized-identity/credential-manifest/blob/main/spec/spec.md\r\n - https://github.com/decentralized-identity/credential-manifest/pull/131/files#diff-c4795c497b83a8c03e33535caf0fb0e1512cecd8cb448f62467326277c152afeR379\r\n - https://github.com/decentralized-identity/credential-manifest/blob/main/spec/spec.md#credential-response\r\n - > // NOTE: VP, OIDC, DIDComm, or CHAPI outer wrapper properties would be at outer layer\r\n- https://github.com/decentralized-identity/credential-manifest/blob/main/test/credential-manifest/test.js\r\n- TODO\r\n - [x] Resize LUKS fedora root to use full SSD attached via USB 3.1 :P it's fast!\r\n - [ ] \"We need to consider automation too to make this work in the CI/CD pipeline. We use the open-source Data Flow Facilitator for Machine Learning (DFFML) framework to establish a bidirectional data bridge between the LTM and source code. When a new pull request is created, an audit-like scan is initiated to check to see if the LTM needs to be updated. For example, if a scan detects that new cryptography has been added to the code, but the existing LTM doesn\u2019t know about it, then a warning is triggered. Project teams can triage the issue to determine whether it is a false positive or not, just like source code scans.\" [John L Whiteman]\r\n - [Rolling Alice: Progress Report 6: Living Threat Models Are Better Than Dead Threat Models](https://gist.github.com/pdxjohnny/07b8c7b4a9e05579921aa3cc8aed4866#file-rolling_alice_progress_report_0006_living_threat_models_are_better_than_dead_threat_models-md)\r\n - [ ] Investigate https://github.com/BishopFox/sliver for comms"
}
]
},
{
"body": "# 2022-11-12 Engineering Logs",
"replies": [
{
"body": "## 2022-11-12 @pdxjohnny Engineering Logs\r\n\r\n- \ud83d\udefc security \ud83e\udd14\r\n - Twitter conversation with Dan resulted only in roller coaster bogie (boh-gee) lock idea.\r\n - Roller skate security play on words?\r\n - Roll\u2019r fast, roll\u2019r tight, roll\u2019r clean, secure rolling releases with Alice.\r\n - Content addressable with context aware caching\r\n - See recent \r\n - Minimal attack surface\r\n - See unikernel in thread\r\n - No vulns or policy violations\r\n - Development future aligned with strategic principles\r\n - *Gif of Alice on roller skates throwing a bowling ball which is a software vuln, strike, she frontflips throwing knife style throws the pins into pull requests. We zoom out and see her just doing this over and over again around the Entity Analysis Trinity. Intent/LTM is where the throwing board is. Bowling alley is static analysis and the end of the bowling alley where she frontflips over (through hoop of CI/CD fire?) is where she picks up the pins and throws them as pull request (titles and numbers maybe, pulls/1401 style maybe?) knives into the board at the top which is the LTM and codebase. Then from top, LTM to static analysis where bowling alley starts she's in the lab, cooking up the vuln or maybe out looking for it. Or maybe refactoring after pull requests!*\r\n- https://arstechnica.com/gadgets/2022/10/everything-we-know-about-the-white-houses-iot-security-labeling-effort/\r\n- https://github.com/shirayu/whispering\r\n - couldn\u2019t make it work\r\n\r\n```console\r\n$ sudo dnf install -y portaudio-devel\r\n$ pip install -U git+https://github.com/shirayu/whispering.git@v0.6.4\r\n$ whispering --language en --model medium\r\nUsing cache found in /home/pdxjohnny/.cache/torch/hub/snakers4_silero-vad_master\r\n[2022-11-14 07:23:58,140] cli.transcribe_from_mic:56 INFO -> Ready to transcribe\r\nAnalyzing/home/pdxjohnny/.local/lib/python3.10/site-packages/torch/nn/modules/module.py:1130: UserWarning: operator() profile_node %668 : int[] = prim::profile_ivalue(%666)\r\n does not have profile information (Triggered internally at ../torch/csrc/jit/codegen/cuda/graph_fuser.cpp:104.)\r\n return forward_call(*input, **kwargs)\r\n```\r\n\r\n```console\r\n$ set -x; for file in $(ls If*.m4a); do python -uc 'import sys, whisper; print(whisper.load_model(\"medium.en\").transcribe(sys.argv[-1])[\"text\"])' \"${file}\" 2>&1 | tee \"${file}.log\"; done\r\n```\r\n\r\n- Python rewrite of this loop at the end of this entry\r\n- TODO\r\n - [ ] https://github.com/CycloneDX/bom-examples/tree/master/OBOM/Example-1-Decoupled\r\n - this as system context inputs for validity check\r\n - [ ] VDR\r\n - [ ] VEX\r\n - Payload (system context, see did as service endpoint architecting alice streams) goes in `detail`\r\n - https://github.com/CycloneDX/bom-examples/blob/83248cbf7cf0d915acf0d50b12bac75b50ad9081/VEX/Use-Cases/Case-1/vex.json#L47"
},
{
"body": "# If You Give A Python A Computer\r\n\r\nMoved to: https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md\r\n\r\nIf you give a Python a computer, they're going to want to write a script. If they want to write a script, they're probably going to want to call another script. If they're going to call a script, they're going to want to read the output. If they read the output, they're going to want to write it somewhere else. So if they write the script, that's the first operation. If they read the output, now that gets into the importance of the... Okay. If they write a script, that becomes the operation. Now they want to execute another command. Okay. So that's another operation. Now, if... Now, reading the output. So... Now, reading the output, it comes in an event-based way. Because you need to wait for the return code, and you want to read line by line, and you want to do all that at the same time. Right. So you're going to end up with, you know, what amounts to the... execute some process, but being run in a dataflow will have to show that. And then, okay... pass that script. They're going to want to write it. Call another script. If they're going to call a script, they're going to want to read the output. Okay. If they're going to read the output, they're probably going to want to do something with the output. Or they're probably going to want to write it somewhere else. If they're going to want to write it somewhere else, that means that they need to use the network. If they're going to use the network, they should probably be using asyncio. Okay. So, now what happens after you've written it somewhere else? Okay. Probably running something remotely. Okay, what do you usually do? Yeah, you're going to want to do something remote. You're going to want to write it somewhere else. Okay, well, where are you going to... If you want to write it somewhere else, you probably want a web service to receive it. You probably want to write something to receive it. Yeah, you want to... If you're going to write it somewhere else, you probably need to write something to receive it somewhere else. Okay. And now that's the first time where we've got... The first operation is the script. It executes the subprocess, which is in the same machine, and then it wants to write it somewhere else. So now you can have the implementation of the script is on one machine, and now we can show how the input moves to the other machine using the execution environment. Okay, it's going to want to write something to receive it. Now, if you write something to receive... What is he going to want to do? He's probably going to want to run that on another computer. Okay. He's probably going to want to run it on another computer. And when he runs it on another computer, he's probably going to need to deploy... He's probably going to... If he wants to run it on another computer, then he's going to need to build it.\r\n\r\nHe's gonna want to run on our computer. He's going to want to build it. No, he's going to want to build it. He's going to build it. And then this is where we get into something where it's like, uh, synthesis. Where we can basically say, hey, so we're sending from... Okay, so basically we're running the script on one machine. We're sending to the other machine. So, the other machine, and we send it to the other machine, we're doing that via probably an operation implementation network, which exposes the open API. 
Or which hits its server, which exposes the open API. So then we need to go synthesize the server that builds the open API. So, the implementation is seen by the side. The implementation is seen by the side that runs the script is the open API client. Now, when you take the same data flow and you render it like you can take the same data, so you can be executing the data flow, or you can take the data flow and you could do like a build, essentially. And when you do the build, the implementation, you see, yeah, when you do the build, it's essentially, it's essentially using an orchestrator to do the build. Is it using an orchestrator to use the build? I think no, I think it might just be like a run data flow. And the run data flow handles putting things together. So it might see this operation that says, you know, what does the operation say? It says it's to receive, you know, receive client, receive something operation. Right. OK. And I really like it's the log, you know, it's the write to log file. OK, it's right to log file. Right. Write to file. No, update Web page. Update Web page. OK. And then we can see a Web page that just shows the updated value. All right. So. OK. And then we can just run the output and pull and refresh the Web page. OK, so. OK, so. OK, so now you're going to synthesize this thing. So how would you do that? Basically, ideally, you would describe it as a data flow or you would describe it. Maybe you describe it as a what you're going to describe as a data flow. So how do you describe it? So maybe your run data flow here is something like. Some kind of. You know, it's a it's a synthesis run data flow. Very cool. So it's some kind of synthesis run data flow instead of instead of actually it's it's. OK, so how are you swapping that out? Well, you're swapping out the operation implementation when you do the execution. So you swap out the operation. So you swap out. OK, well. So. Do the execution when you do the execution. So you have essentially have multiple. Multiple. OK, so you have multiple. You might actually select a different. So you have selected the operation implementation for you essentially have like a client and a server. And so somewhere in the data flow, you say for client. For client. Then choose the operation implementation network, like each operation instance has a preferred implementation network for each deployment method. And so when you synthesize your server, you say my preferred method is OK. You say my preferred method is essentially the synthesize thing. And then. Yeah, it's like a build. Your preferred method is actually build. And what even like does it even matter that you have the inputs there? No, probably not, because you're probably going to say you're probably going to say pass data flow to the build, which you're probably going to pass the data flow to the build, which will. You're going to pass the data flow to the build in the builds config, which means that you need to configs specific to deployment as well. And so you need configs that are specific to deployment as well. So. Can fix specific to. Yeah. OK, so then. So you can fix this specific to. So you need to config specific for build and I can fix specific for deploy. OK, so in the build.\r\n\r\nIn the build specific configs you have a data flow. In that data flow it probably contains, for example, say we were doing this. Say we wanted to build a fast API. We're going to build this fast API thing. We're actually going to synthesize one of the roots. 
We'll synthesize one of the roots. We'll output a Python file that runs the data flow where the input is one of those model things, and the model will take the inputs as shown from the parent data flow, whatever the inputs to the operation were. Basically, you run data flow with the build option set. With the build, your target is build. So you run data flow, your target is build. Now your operation implementation knows what its inputs are. It's going to take those inputs and their definitions. Because you're saying, I am a builder, you're probably going to inject the inputs to your own. You're probably going to take the operation that you're building for and you're going to add it as an input to the network itself, like the operation itself, so that then the data flow that does the build would say, because you're basically saying the build, you're executing run data flow. On the client, you're going to end up with an operation implementation which calls the open API spec, like the open API spec endpoint. You're going to end up with an operation implementation that calls the open API spec. When you do the build, the build says, like server build for example, you would pass server build says, prefer an operation implementation. When you run data flow server build, the other one is a NOP. Essentially, you NOP the client stuff. You have NOPs, the client specific implementations are NOPs. The client specific implementations are NOPs and you end up doing actually run data flow without any inputs. It's kicked off whenever the data flow went on initialization, whatever that flow was, whatever that code path was through the orchestrator. It kicks off the operations that don't have any inputs. It'll kick off this operation because this operation is actually run data flow and the original one that was running the script is actually a NOP in the server build version. It's run data flow and the script is NOP. Now we need to build, but it's run data flow. If we run data flow, we're going to say add the operation itself as the... We might need a specific version of run data flow for this because I don't know if this is something that we would add in the config to run data flow. It seems a little bit specific to a build process type of run, but we might be a separate operation is what I mean. Basically, what you end up is not really an implementation specific over preference. I think that probably comes somewhere else. You probably have an operation implementation preference for each operation at run time where you would prefer... You have two things. You basically have deployment specific overrides. You have a deployment specific override and then you have a at execution time deployment implementation preference per deployment. You run the build. It adds the input to the network and it specifies and you've given it the data flow. The data flow you've given it says write me a... Write out a file that is an open API server or a fast API server. It writes out the file that's a fast API server. It uses the input definitions to create the model and the result is a built fast API app. Now you have the deploy phase and then you might take that and you might turn it into a container. Now you would have the deploy data flow. You would run the deploy step on the same data flow and you would say... You would run the deploy step on the same data flow and it would then take the built application and you would run the deploy phase on the same data flow and it would take the built application. 
Then if you give up Python... If he wanted to write the... If he wanted to read the logs then he wanted to write the logs. If he wanted to write the logs he's probably going to want to write them to his server. If he wants to write the logs that's where we say the part about AsyncIO. If he wants to write them to his server then now we need to figure out, okay, how is he going to write his server? What is his server? That's where we get into the synthesis and the build version of the data flow. Now if he's going to want to write the summary he's probably going to need a server. If he's going to need a server he's going to write a service... Yeah, he's going to need a service. If he's going to write a service he's going to need to deploy a service. Now we get into deployment. Now we need to think somehow about the artifacts that were created in the build process. How do we communicate the fact that there are outputs from one stage? Because it almost becomes... It is a different data flow really. Where are we picking up those outputs? That stuff is probably all in config. We probably have... Yeah, so we've probably configured... We've probably configured... Yeah, that stuff is all in config. For example, those data flows, the build data flow, the one that we're supplying to the run data flow when we override for the build phase, which means configs needs to have an option for deployment specific stuff. When we do that for build phase we're going to write out... The data flow will take in its configuration, the places where it writes things. Then the deployment data flow will just configure with the same credentials or the same whatever or the same output path so that it understands. We're not facilitating... Do we need to facilitate that? If you wanted to do that you would write one data flow that would call both of them and then pass the outputs between them. Yeah, you could have a data flow that does build and deploy. You could run the build stage and you can run the deploy stage or you could have a build and deploy data flow. The build and deploy data flow would say, okay, run the build data flow. When you run it... Let's see. When you run the build data flow you need to tell it where the hell you're building, where the hell you're going to... You need to configure it. Does that need to be configured or inputs? Because most of it is inputs are runtime things. Configuration is locked at data flow. I would say that you can override that data flow. For example, you wanted to build this server and it comes out of the container. Now I want to push that container somewhere. You built it and now you want to push it somewhere. When you push it somewhere you do the build. Say you do the build and it's entirely in memory somehow. Then you push an in memory image as an input to an operation which does something to it. It's probably going to push it to a registry. You could potentially just swap out that operation. In that case the registry is probably helping us configure it. Remember we can take anything that's configured and we can make it an input and we can take anything that's an input and make it configured if you wanted to. You could have re-scoping operations. It's essentially that little operation that we talked about that has... You could wrap any input. You could wrap any operation and make the scope different on this. Okay. Now deployment artifacts. Build artifacts, deployment artifacts. The build, where do you separate that? Is the build build and push? Is the build just build? 
Okay, if it's just build then yeah, you end up with this image and you're like, what do I do with the image? You probably need to push it somewhere. From that perspective you need to have an operation in the data flow that's going to do that push somewhere. Now how do you communicate where it was pushed to the other thing? Well when you run that data flow you either need to have configured the operations or you need to be passing them as inputs. That's really up to you. You can... Yeah. If you configure them then you can always wipe them out with an override and make them configurable. Make them be in operations that you used to take it as a config but you're overriding it to take it as an input. Now that you have that, okay so you've built and pushed then you run the deploy. The deploy, you have a data flow that's just run data flow stage build, run data flow stage deploy and then that would be built and deployed. If you give a Python, if he wants to write a service he's going to want to deploy a service. If he's going to want to deploy a service then it's the same flow as the build. You just show him it again. Now if he's going to want to deploy a service he's going to want to deploy a service from his CI CD. If he's going to want CI CD and then what do we go into the whole build server process? I'm not sure. Maybe.\r\n\r\nAnd if he wants to deploy a service, he's going to want some logs. And if he wants some logs, Oh, wait, no, we can't go yet. We have to, we have to finish out. If he's going to deploy a service, he's going to want some logs. Okay. And then we talk about, and then we talk about the integration with the open lineage stuff. We can talk about the integration with the open lineage stuff for John Lader who can't hear himself than Apple.\r\n\r\nScratch the logs. Alright, well, in that order. So, if he's gonna write a service, he's going to have to configure it. Alright, if he's gonna, if he's gonna deploy, oh, if he's gonna deploy a service, he's going to need some secrets. Okay, and now we talk about the whole secret thing and the input transforms and yeah, that whole thing. We'll talk about that whole thing. And, okay, yeah, it's gonna, and if he's going to, and if he's going to manage his secrets, he's going to need to do his security process. If he's going to do his security process, okay, and when he does his security process, here's the level of audibility, the auditability and the visibility in throughout the entire thing in an automated way. And if, okay, and if he's going to do his security process, then he's going to need, then he's going to need some logs. And if he's going to need some logs, then we do the whole open lineage thing, right. And if he's gonna have some logs, and if he's gonna have some logs, then he's gonna look for bugs. Okay, maybe he's gonna look for bugs. What is he gonna do with the logs? Okay, he's gonna look for bugs, he's gonna look for bugs and logs, he's going to, I don't know, probably looking for bugs. So, okay, but how do we get into the thing where you have the real-time updates throughout the entire thing? So, okay, the bugs, okay, the bugs, and if he's gonna look for logs, okay, so security and then he's got logs. And so the logs, then the logs, then the logs, we get into the open lineage thing. Yeah, we get into the open lineage thing and we can look at the data flow as it's running and we can do analysis on and, you know, what is happening as it's happening. 
And we can potentially even report that information all the way back through the client. Have we covered everything? I think we have. Perfect.\r\n\r\nOkay, and then, and if you're gonna fix some bugs, so if you're gonna find some bugs, you're gonna fix some bugs. If you're gonna fix some bugs, you're gonna want some CI CD. And if you want some CI CD, then blah blah blah blah blah, then we tell the story about kube control, fucking etc. And I think we have a wrap all the way back in the whole circle of development. I think we've covered every single part, unless we have not. What else might we need to cover? So we covered building the app, deploying the app, across platforms, running it across platforms, events, logging, bugs, bug fixing, security, fuck man. Alright, okay.\r\n\r\n\r\nSo, if you synthesize data flow, you may lose things like event emissions of inputs between operations. So we need a way to say that, we need a way, we need that way to say what events, events, what events are you expecting? The data flow should declare what events it's expecting to yield as an allow list.\r\n\r\nAdded (2022-11-14): If you give Monty Python a computer, they\u2019ll want to search for the Holy Grail. If they want to search for the Holy Grail, they might find the system context. If they find the system context, they\u2019ll know that the Holy Grail is the Trinity is the system context: the upstream, the overlay, and the orchestrator. ;)"
}
]
},
{
"body": "# 2022-11-13 Engineering Logs",
"replies": [
{
"body": "## 2022-11-13 @pdxjohnny Engineering Logs\r\n\r\n> - The following mermaid diagram became: https://github.com/intel/dffml/commit/fbcbc86b5c52932bccf4cd6321f4e79f60ad3023\r\n> - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md#system-context\r\n> - Original: ![2022-11-13-Alice-ASAP-System-Context-Sketch](https://user-images.githubusercontent.com/5950433/201754772-0b326492-69ea-4518-90be-6a850d960688.jpeg)\r\n\r\n```mermaid\r\ngraph TD\r\n subgraph system_context[System Context]\r\n upstream[Upstream]\r\n overlay[Overlay]\r\n orchestrator[Upstream]\r\n end\r\n```\r\n\r\n- Theres a poets beach poem that goes with this.\r\n - \u201ctimeless\u201d, the one from Athena/Minerva\r\n\r\n![E35628A2-B9F3-4A29-88C8-F773A7A9F9C9](https://user-images.githubusercontent.com/5950433/201529807-c7e63b48-6f41-4686-98be-bb73484df83f.jpeg)\r\n\r\n"
}
]
},
{
"body": "# 2022-11-14 Engineering Logs",
"replies": [
{
"body": "## 2022-11-14 @pdxjohnny Engineering Logs\r\n\r\n- https://qwik.builder.io/docs/getting-started/ \r\n - Serialization of cached flow via overlay to inputs to qwik cache resume\r\n - https://qwik.builder.io/docs/concepts/resumable/\r\n - https://qwik.builder.io/docs/advanced/qrl/\r\n- https://www.intel.com/content/www/us/en/newsroom/news/intel-introduces-real-time-deepfake-detector.html#gs.isnpod\r\n - ActivityPub (mastodon) \u201cfollow\u201d post metrics / SCITT receipt of analysis if video is deepfake as reply.\r\n- Architecting Alice: An Image: ActivityPub posts with YAML body content and image attached with post quantum jwk or scitt receipt or maybe content address of scitt reciept?\r\n- https://twitter.com/pippellia/status/1592184568345509888\r\n - Central planning and chaos\r\n - This is why we focus in equilibrium\r\n- https://arxiv.org/abs/2211.01724\r\n - > We formulate learning for control as an inverse problem -- inverting a dynamical system to give the actions which yield desired behavior. The key challenge in this formulation is a distribution shift -- the learning agent only observes the forward mapping (its actions' consequences) on trajectories that it can execute, yet must learn the inverse mapping for inputs-outputs that correspond to a different, desired behavior. We propose a general recipe for inverse problems with a distribution shift that we term iterative inversion -- learn the inverse mapping under the current input distribution (policy), then use it on the desired output samples to obtain new inputs, and repeat. As we show, iterative inversion can converge to the desired inverse mapping, but under rather strict conditions on the mapping itself.\r\n >\r\n > We next apply iterative inversion to learn control. Our input is a set of demonstrations of desired behavior, given as video embeddings of trajectories, and our method iteratively learns to imitate trajectories generated by the current policy, perturbed by random exploration noise. We find that constantly adding the demonstrated trajectory embeddings as input to the policy when generating trajectories to imitate, a-la iterative inversion, steers the learning towards the desired trajectory distribution. To the best of our knowledge, this is the first exploration of learning control from the viewpoint of inverse problems, and our main advantage is simplicity -- we do not require rewards, and only employ supervised learning, which easily scales to state-of-the-art trajectory embedding techniques and policy representations. With a VQ-VAE embedding, and a transformer-based policy, we demonstrate non-trivial continuous control on several tasks. We also report improved performance on imitating diverse behaviors compared to reward based methods.\r\n- Search compressed asciinemas recordings\r\n\r\n```console\r\n$ (for file in $(ls ~/asciinema); do unxz -d - < ~/asciinema/$file; done) | grep -i /1421\r\n```\r\n\r\n- [Mark Foster\u2019s Linked Data User Experience Notes](https://docs.google.com/document/d/17n8hfdPfqfpbPj4ss-ep4nCkpp9ZBoy6U2Q1t7j-knI/edit)\r\n - https://futureinternet.io\r\n - https://twitter.com/mfosterio/status/1591580950752002048\r\n - > I\u2019ve been looking for ways to access toots in JSON-LD Activity Streams I can return my profile by passing the header Accept application/ld+json on https://mas.to/@mfoster/ but my toots are in JSON https://mas.to/api/v1/accounts/109254208668258721/statuses\r\n - I haven\u2019t been following Mark for long (3 months? 
However he seems extremely capable, everyone in SCITT, everyone in GUAC for sure, and of course Changuard folks, Dan, starts with an A? The main wolfi maintainer) know whats up and the DIF crew as the leaders) if he is playing with the same shit we were thinking with activitypub and current federation technology as a bridge / base to build up and integrate infra full decentralization\r\n - ^ Strategic mapping (wardly maps) of train of thought (supply chain security) activity (life) open/internal implementation/spec research / definition see early videos explainers on doing depth of field mapping state of the art mapping see recent threat model example on avoiding engagement with unaligned research communities.\r\n - Mastodon SCITT review on data provenance (attached as knowledge graph link, reply, source: github docs? Content exact match? Add reply with SCITT recpit as body, integrate into mastodon to show these types of replys integrated into UI HTTPS CA view from browser style check with detail expand, but in html, just as a ui example, parse out the fields and display them nice\r\n- https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture-02#section-7\r\n- \u201cMaybe it\u2019s a dream?\u201d Sequence - 2022-09-21\r\n- https://mermaid-js.github.io/mermaid-live-editor/\r\n\r\n```mermaid\r\nsequenceDiagram\r\n BobSCITT->>+Bob: Generate did:pkg:bobsoftware serialized federated ledger claim / recepit\r\n Alice->>+AliceSCITT: Generate did:oa:merkleofshouldicontribute serialized federated ledger claim / recepit\r\n```\r\n\r\n- Cross referencing is fun\r\n - Graphs are fun\r\n - https://en.wikipedia.org/wiki/Knowledge_graph\r\n- Unfortunately GitHub reworks the links which include the `#discussioncomment-4131964` part in them on display and results in jumping to the top of the thread.\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/201763045-e69ce8b2-df40-487a-8b91-bb28691889c2.png)\r\n\r\n- Podman oddities\r\n- No time for SELinux policies currently but we should integrate in the future\r\n (`JobKubernetesOrchestrator`?)\r\n - https://github.com/containers/udica#creating-selinux-policy-for-container\r\n\r\n```console\r\n$ sudo setenforce 0\r\n$ sudo dnf install -y aardvark-dns podman-compose\r\n```\r\n\r\n- Spinning up mastodon\r\n - What do you call N instances of Alice communicated via the\r\n Thought Communication Protocol?\r\n - A Mastodon server full of toots\r\n- References\r\n - https://github.com/containers/podman-compose\r\n - https://github.com/mastodon/mastodon\r\n - https://docs.joinmastodon.org/admin/setup/\r\n - https://github.com/mastodon/mastodon/commit/b17202ca0f19b83beb25afdba7e713a0f9329ffa\r\n- If `podman-compose` asks which registry for images choose `docker.io`\r\n- Getting name resolution failures (DNS)\r\n - Fixed by installing aardvark-dns\r\n- Ruby projects usually have an initial database population\r\n - This must be done on first load to preform database \"migrations\", setting up the DB.\r\n - `FATAL: role \"mastodon\" does not exist`\r\n - https://github.com/mastodon/mastodon/issues/18113\r\n - https://github.com/mastodon/mastodon/pull/16947\r\n - `FATAL: database \"mastodon_production\" does not exist`\r\n - https://hub.docker.com/_/postgres\r\n - > `POSTGRES_DB`\r\n >\r\n > This optional environment variable can be used to define a different name for the default database that is created when the image is first started. 
If it is not specified, then the value of `POSTGRES_USER` will be used.\r\n- On `podman-compose up` it still complains\r\n - `2022-11-15 05:41:58.177 UTC [90] FATAL: database \"mastodon_production\" does not exist`\r\n - `2022-11-15 05:42:02.256 UTC [91] FATAL: role \"mastodon\" does not exist`\r\n\r\n```console\r\n$ git clone https://github.com/mastodon/mastodon\r\n$ cd mastodon\r\n$ git checkout v4.0.2\r\n$ git log\r\ncommit 03b0f3ac83edfc46d304bfca1539ca6000e36fc3 (HEAD, tag: v4.0.2, main)\r\nAuthor: Eugen Rochko <eugen@zeonfederated.com>\r\nDate: Tue Nov 15 03:57:18 2022 +0100\r\n\r\n Bump version to 4.0.2 (#20725)\r\n$ podman-compose run web bundle rake mastodon:webpush:generate_vapid_key\r\nVAPID_PRIVATE_KEY=djDWtpmK3CD9SUu_UedWOyOGBA-Fg5r5MWiXVhZHZbo=\r\nVAPID_PUBLIC_KEY=BOVhs2nJ4MpjdaHAVu7UdlPlNjzMX2pKFyKgOxvYO7LX8eh_H3TA_O_Ebc2asJPhDoqImE-3Xz0BmaeM_EucIr0=\r\n$ podman-compose run web bundle rake secret\r\n6ece0cfc0772308479f5cd6155cfc282defab20307a185b399dd6cf2f9b4dc3a81691406c368905c64ccafa56e05473371dccb3b948001369b18be57cfefa9f4\r\n$ podman-compose run web bundle rake secret\r\ne2fdd51aef896d5c8c647dbbf6b77426d3df59a2817181738afc0ae8ab9e34a413ac5f21ef9aed41f38260075ff6a327f29e717f03c66296dfc0838402851714\r\n$ cat > .env.production <<'EOF'\r\n# This is a sample configuration file. You can generate your configuration\r\n# with the `rake mastodon:setup` interactive setup wizard, but to customize\r\n# your setup even further, you'll need to edit it manually. This sample does\r\n# not demonstrate all available configuration options. Please look at\r\n# https://docs.joinmastodon.org/admin/config/ for the full documentation.\r\n\r\n# Note that this file accepts slightly different syntax depending on whether\r\n# you are using `docker-compose` or not. 
In particular, if you use\r\n# `docker-compose`, the value of each declared variable will be taken verbatim,\r\n# including surrounding quotes.\r\n# See: https://github.com/mastodon/mastodon/issues/16895\r\n\r\n# Federation\r\n# ----------\r\n# This identifies your server and cannot be changed safely later\r\n# ----------\r\nLOCAL_DOMAIN=example.com\r\n\r\n# Redis\r\n# -----\r\n# REDIS_HOST=localhost\r\nREDIS_HOST=redis\r\nREDIS_PORT=6379\r\n\r\n# PostgreSQL\r\n# ----------\r\n# DB_HOST=/var/run/postgresql\r\nDB_HOST=db\r\nDB_USER=mastodon\r\nDB_NAME=mastodon_production\r\nDB_PASS=mastodon\r\nDB_PORT=5432\r\n\r\n# Elasticsearch (optional)\r\n# ------------------------\r\n# ES_ENABLED=true\r\n# ES_HOST=localhost\r\n# ES_PORT=9200\r\n# Authentication for ES (optional)\r\n# ES_USER=elastic\r\n# ES_PASS=password\r\n\r\n# Secrets\r\n# -------\r\n# Make sure to use `podman-compose run web bundle rake secret` to generate secrets\r\n# -------\r\nSECRET_KEY_BASE=6ece0cfc0772308479f5cd6155cfc282defab20307a185b399dd6cf2f9b4dc3a81691406c368905c64ccafa56e05473371dccb3b948001369b18be57cfefa9f4\r\nOTP_SECRET=e2fdd51aef896d5c8c647dbbf6b77426d3df59a2817181738afc0ae8ab9e34a413ac5f21ef9aed41f38260075ff6a327f29e717f03c66296dfc0838402851714\r\n\r\n# Web Push\r\n# --------\r\n# Generate with `podman-compose run web bundle rake mastodon:webpush:generate_vapid_key`\r\n# --------\r\nVAPID_PRIVATE_KEY=djDWtpmK3CD9SUu_UedWOyOGBA-Fg5r5MWiXVhZHZbo=\r\nVAPID_PUBLIC_KEY=BOVhs2nJ4MpjdaHAVu7UdlPlNjzMX2pKFyKgOxvYO7LX8eh_H3TA_O_Ebc2asJPhDoqImE-3Xz0BmaeM_EucIr0=\r\n\r\n# Sending mail\r\n# ------------\r\n# SMTP_SERVER=smtp.mailgun.org\r\n# SMTP_PORT=587\r\n# SMTP_LOGIN=\r\n# SMTP_PASSWORD=\r\n# SMTP_FROM_ADDRESS=notifications@example.com\r\n\r\n# File storage (optional)\r\n# -----------------------\r\n# S3_ENABLED=true\r\n# S3_BUCKET=files.example.com\r\n# AWS_ACCESS_KEY_ID=\r\n# AWS_SECRET_ACCESS_KEY=\r\n# S3_ALIAS_HOST=files.example.com\r\n\r\n# IP and session retention\r\n# -----------------------\r\n# Make sure to modify the scheduling of ip_cleanup_scheduler in config/sidekiq.yml\r\n# to be less than daily if you lower IP_RETENTION_PERIOD below two days (172800).\r\n# -----------------------\r\nIP_RETENTION_PERIOD=31556952\r\nSESSION_RETENTION_PERIOD=31556952\r\nEOF\r\n$ head -n 16 docker-compose.yml \r\nversion: '3'\r\nservices:\r\n db:\r\n restart: always\r\n image: postgres:14-alpine\r\n shm_size: 256mb\r\n networks:\r\n - internal_network\r\n healthcheck:\r\n test: ['CMD', 'pg_isready', '-U', 'postgres']\r\n volumes:\r\n - ./postgres14:/var/lib/postgresql/data\r\n environment:\r\n - 'POSTGRES_DB=mastodon_production'\r\n - 'POSTGRES_USER=mastodon'\r\n - 'POSTGRES_PASSWORD=mastodon'\r\n$ podman-compose down\r\n$ sudo rm -rf postgres14/\r\n$ time podman-compose run web bundle exec rake mastodon:setup\r\n$ podman-compose up\r\npodman start -a mastodon_db_1\r\npodman start -a mastodon_redis_1\r\npodman start -a mastodon_web_1\r\npodman start -a mastodon_streaming_1\r\npodman start -a mastodon_sidekiq_1\r\nWARN Starting streaming API server master with 3 workers \r\n=> Booting Puma\r\n=> Rails 6.1.7 application starting in production \r\n=> Run `bin/rails server --help` for more startup options\r\nWARN Starting worker 3 \r\nWARN Starting worker 2 \r\nWARN Worker 3 now listening on 0.0.0.0:4000 \r\nWARN Worker 2 now listening on 0.0.0.0:4000 \r\nWARN Starting worker 1 \r\nWARN Worker 1 now listening on 0.0.0.0:4000 \r\n2022-11-15T05:55:05.712Z pid=2 tid=53y WARN: `config.options[:key] = value` is deprecated, use 
`config[:key] = value`: [\"/opt/mastodon/lib/mastodon/redis_config.rb:38:in `<top (required)>'\", \"/opt/mastodon/config/application.rb:53:in `require_relative'\"]\r\n2022-11-15T05:55:06.117Z pid=2 tid=53y INFO: Booting Sidekiq 6.5.7 with Sidekiq::RedisConnection::RedisAdapter options {:driver=>:hiredis, :url=>\"redis://redis:6379/0\", :namespace=>nil}\r\n[4] Puma starting in cluster mode...\r\n[4] * Puma version: 5.6.5 (ruby 3.0.4-p208) (\"Birdie's Version\")\r\n[4] * Min threads: 5\r\n[4] * Max threads: 5\r\n[4] * Environment: production\r\n[4] * Master PID: 4\r\n[4] * Workers: 2\r\n[4] * Restarts: (\u2714) hot (\u2716) phased\r\n[4] * Preloading application\r\n[4] * Listening on http://0.0.0.0:3000\r\n[4] Use Ctrl-C to stop\r\n[4] - Worker 0 (PID: 10) booted in 0.01s, phase: 0\r\n[4] - Worker 1 (PID: 11) booted in 0.0s, phase: 0\r\n2022-11-15 05:55:07.954 UTC [233] FATAL: role \"postgres\" does not exist\r\n2022-11-15T05:55:09.222Z pid=2 tid=53y INFO: Booted Rails 6.1.7 application in production environment\r\n2022-11-15T05:55:09.222Z pid=2 tid=53y INFO: Running in ruby 3.0.4p208 (2022-04-12 revision 3fa771dded) [x86_64-linux]\r\n2022-11-15T05:55:09.222Z pid=2 tid=53y INFO: See LICENSE and the LGPL-3.0 for licensing details.\r\n2022-11-15T05:55:09.222Z pid=2 tid=53y INFO: Upgrade to Sidekiq Pro for more features and support: https://sidekiq.org\r\n2022-11-15T05:55:09.227Z pid=2 tid=53y INFO: Loading Schedule\r\n2022-11-15T05:55:09.227Z pid=2 tid=53y INFO: Scheduling scheduled_statuses_scheduler {\"every\"=>\"5m\", \"class\"=>\"Scheduler::ScheduledStatusesScheduler\", \"queue\"=>\"scheduler\"}\r\n2022-11-15T05:55:09.228Z pid=2 tid=53y INFO: Scheduling trends_refresh_scheduler {\"every\"=>\"5m\", \"class\"=>\"Scheduler::Trends::RefreshScheduler\", \"queue\"=>\"scheduler\"}\r\n2022-11-15T05:55:09.231Z pid=2 tid=53y INFO: Scheduling trends_review_notifications_scheduler {\"every\"=>\"6h\", \"class\"=>\"Scheduler::Trends::ReviewNotificationsScheduler\", \"queue\"=>\"scheduler\"}\r\n2022-11-15T05:55:09.232Z pid=2 tid=53y INFO: Scheduling indexing_scheduler {\"every\"=>\"5m\", \"class\"=>\"Scheduler::IndexingScheduler\", \"queue\"=>\"scheduler\"}\r\n2022-11-15T05:55:09.234Z pid=2 tid=53y INFO: Scheduling vacuum_scheduler {\"cron\"=>\"59 5 * * *\", \"class\"=>\"Scheduler::VacuumScheduler\", \"queue\"=>\"scheduler\"}\r\n2022-11-15T05:55:09.237Z pid=2 tid=53y INFO: Scheduling follow_recommendations_scheduler {\"cron\"=>\"44 8 * * *\", \"class\"=>\"Scheduler::FollowRecommendationsScheduler\", \"queue\"=>\"scheduler\"}\r\n2022-11-15T05:55:09.239Z pid=2 tid=53y INFO: Scheduling user_cleanup_scheduler {\"cron\"=>\"2 5 * * *\", \"class\"=>\"Scheduler::UserCleanupScheduler\", \"queue\"=>\"scheduler\"}\r\n2022-11-15T05:55:09.240Z pid=2 tid=53y INFO: Scheduling ip_cleanup_scheduler {\"cron\"=>\"13 4 * * *\", \"class\"=>\"Scheduler::IpCleanupScheduler\", \"queue\"=>\"scheduler\"}\r\n2022-11-15T05:55:09.242Z pid=2 tid=53y INFO: Scheduling pghero_scheduler {\"cron\"=>\"0 0 * * *\", \"class\"=>\"Scheduler::PgheroScheduler\", \"queue\"=>\"scheduler\"}\r\n2022-11-15T05:55:09.245Z pid=2 tid=53y INFO: Scheduling instance_refresh_scheduler {\"cron\"=>\"0 * * * *\", \"class\"=>\"Scheduler::InstanceRefreshScheduler\", \"queue\"=>\"scheduler\"}\r\n2022-11-15T05:55:09.247Z pid=2 tid=53y INFO: Scheduling accounts_statuses_cleanup_scheduler {\"interval\"=>\"1 minute\", \"class\"=>\"Scheduler::AccountsStatusesCleanupScheduler\", \"queue\"=>\"scheduler\"}\r\n2022-11-15T05:55:09.248Z pid=2 tid=53y INFO: Scheduling 
suspended_user_cleanup_scheduler {\"interval\"=>\"1 minute\", \"class\"=>\"Scheduler::SuspendedUserCleanupScheduler\", \"queue\"=>\"scheduler\"}\r\n2022-11-15T05:55:09.249Z pid=2 tid=53y INFO: Schedules Loaded\r\n2022-11-15T05:55:09.255Z pid=2 tid=53y uniquejobs=upgrade_locks INFO: Already upgraded to 7.1.27\r\n2022-11-15T05:55:09.256Z pid=2 tid=53y uniquejobs=reaper INFO: Starting Reaper\r\n2022-11-15T05:55:09.262Z pid=2 tid=2dsy uniquejobs=reaper INFO: Nothing to delete; exiting.\r\n2022-11-15T05:55:09.265Z pid=2 tid=2dsy uniquejobs=reaper INFO: Nothing to delete; exiting.\r\n[09ee11d4-25e1-4330-9f65-b642ae6a3732] Chewy request strategy is `mastodon`\r\n[09ee11d4-25e1-4330-9f65-b642ae6a3732] method=HEAD path=/health format=*/* controller=HealthController action=show status=200 duration=2.07 view=1.45\r\n2022-11-15 05:55:38.155 UTC [288] FATAL: role \"postgres\" does not exist\r\n[ActionDispatch::HostAuthorization::DefaultResponseApp] Blocked host: 0.0.0.0\r\n[ActionDispatch::HostAuthorization::DefaultResponseApp] Blocked host: localhost\r\nERR! fc8ec631-1ade-4713-a8c8-6125ba6cf87c Error: Access token does not cover required scopes\r\nERR! 17f0501f-de79-45f8-93cb-e5b8bb7178f7 Error: Access token does not cover required scopes\r\n[40016351-367b-4d43-be62-e2340fde46de] method=HEAD path=/health format=*/* controller=HealthController action=show status=200 duration=0.26 view=0.13\r\n[ActionDispatch::HostAuthorization::DefaultResponseApp] Blocked host: localhost\r\n2022-11-15 05:56:08.911 UTC [346] FATAL: role \"postgres\" does not exist\r\n2022-11-15T05:56:09.297Z pid=2 tid=2dv6 INFO: queueing Scheduler::AccountsStatusesCleanupScheduler (accounts_statuses_cleanup_scheduler)\r\n2022-11-15T05:56:09.301Z pid=2 tid=2dvq class=Scheduler::AccountsStatusesCleanupScheduler jid=030c3bd88689321e9097003a INFO: start\r\n2022-11-15T05:56:09.304Z pid=2 tid=2dyi INFO: queueing Scheduler::SuspendedUserCleanupScheduler (suspended_user_cleanup_scheduler)\r\n2022-11-15T05:56:09.306Z pid=2 tid=2dz2 class=Scheduler::SuspendedUserCleanupScheduler jid=03293e9712b7020c368c02bc INFO: start\r\n2022-11-15T05:56:09.341Z pid=2 tid=2dvq class=Scheduler::AccountsStatusesCleanupScheduler jid=030c3bd88689321e9097003a elapsed=0.04 INFO: done\r\n2022-11-15T05:56:09.356Z pid=2 tid=2dz2 class=Scheduler::SuspendedUserCleanupScheduler jid=03293e9712b7020c368c02bc elapsed=0.051 INFO: done\r\n$ curl -v http://localhost:3000/\r\n* Trying 127.0.0.1:3000...\r\n* Connected to localhost (127.0.0.1) port 3000 (#0)\r\n> GET / HTTP/1.1\r\n> Host: localhost:3000\r\n> User-Agent: curl/7.85.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\r\n< HTTP/1.1 403 Forbidden\r\n< Content-Type: text/html; charset=UTF-8\r\n< Content-Length: 0\r\n< \r\n* Connection #0 to host localhost left intact\r\n```\r\n\r\n- TODO\r\n - [ ] SCITT help make no new HTTP headers, SCITT as DID method? SCITT via ATP probably. 
Prototype as Data Repository.\r\n - [x] [SCITT API Emulator Bring Up](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4110695)\r\n - [ ] ActivityPub (Mastodon) bring up\r\n - [ ] Spin up and configure Atuin client / server https://github.com/ellie/atuin/blob/main/docs/server.md\r\n - [x] Update `Architecting Alice: She's Arriving When?` to include a start at some content\r\n we'd planned and drafted here and there related to the system context.\r\n - [docs: tutorials: rolling alice: architecting alice: she's arriving when?: Mermaid diagram for pattern with stream of consciousness and SCITT](https://github.com/intel/dffml/commit/fbcbc86b5c52932bccf4cd6321f4e79f60ad3023)\r\n - In this we only implement in memory and serialized SCITT for a\r\n single entity, Alice, no Bob yet. In `Architecting Alice: Stream of Consciousness`,\r\n we implement Alice and Bob comms on top of SBOM, VEX, VDR.\r\n - [ ] Ping https://github.com/ipvm-wg/spec/pull/8/files with She's Arriving When? and\r\n Our Open Source Guide to illustrate dataflow and provenance.\r\n - [ ] Explain how [https://gist.github.com/pdxjohnny/57b049c284e58f51d0a0d35d05d03d4a](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4141183) hopes to illustrate chaining effects.\r\n - [ ] Update `Architecting Alice: Stream of Consciousness` to include notes on\r\n building off of `Architecting Alice: She's Arriving When?` to now communicate\r\n between Alice and Bob via SBOM, VEX, VDR, etc. rolled in.\r\n - First just do a simple openssl demo where the package is distributed as a binary wheel\r\n via static PyPI, SBOM deployment.\r\n - https://github.com/tpm2-software/tpm2-pytss (this should ldd to openssl)\r\n - Tracking via: https://github.com/intel/dffml/issues/1421\r\n - https://github.com/CycloneDX/cyclonedx-bom-exchange-api\r\n - [ ] Alice CLI command to start working an issue\r\n - `alice please context switch -overlay log_work_to_github_issue https://github.com/intel/dffml/issues/1421`\r\n - Pretty print issue body\r\n - Start logging work to thread\r\n - [ ] Check later today for movement on https://github.com/decentralized-identity/credential-manifest/issues/125#issuecomment-1310728595\r\n - [ ] Simple `python -m http.server --cgi` based implementation of an upload server (see the sketch at the end of this entry)\r\n - Ideally this updates the directory structure of a static PyPI registry (future: OCI image registry)\r\n - Require a SCITT receipt with a manifest of the artifact sha and an OIDC token\r\n - We can self-issue to start\r\n - [ ] Reach out to devs of https://githubnext.com/projects/ai-for-pull-requests/ about abstraction layer / intermediate representation.\r\n - [ ] Mastodon / ActivityPub as Inventory (see PR) a la meta package repo / stream of consciousness / release notifications and metadata (our ATProto precursor)\r\n - [ ] Figure out how to do periodic follow-on scanning with CVE-Bin-Tool\r\n - Could just be ensuring there are GitHub Actions workflows on schedule to scan\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0023/reply_0022.md\r\n - > Create first distinct ancestor of Alice by creating ongoing validation flows to re-check CVEs when new vulns come in. Show how this is trivial by adding those contexts to the chain which are picked up and executed by agents. Agents just look for any contexts that have been issued but not executed. Prioritizer also prioritizes \u201creminder threads\u201d which remind the prioritizer to re-broadcast the train of thought on a periodic cycle if not scheduled for execution, with frequency based on priority. Agents coming online need only look at the chain for tasks\r\n - [ ] Put \"I'm a sign not a cop\" somewhere, seems like there is content to be organized\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0036/reply_0022.md\r\n - [ ] Find a place for more background on the mental model and perhaps tie in the InnerSource example as how we determine if Alice is working on the right stuff (aligned with her strategic principles) when she is the org, and she's running multiple engagements. (system context? or is that overloaded? probably the tie-in with the InnerSource stuff here becomes its own tutorial).\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/alice/discussion/0036/reply_0062.md\r\n - https://github.com/intel/dffml/issues/1287\r\n - [ ] Work on teaching Alice to use the shell / capture context https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0003_a_shell_for_a_ghost.md\r\n - Future\r\n - [ ] Reference current content on capturing shell commands and context; might be better off in Coach Alice, where we want to record, analyze, and detect failure patterns across sessions / devs so that we do not work down known bad paths.\r\n - Revisit dataflows from bash line analysis tie in with consoletest (that refactor stalled out :grimacing:)\r\n - https://github.com/tmux-python/tmuxp\r\n - [ ] Alice, please summarize meeting notes\r\n - [ ] and send as toot to Mastodon thread\r\n - Context awareness overlays for\r\n - Mastodon\r\n - server\r\n - handle\r\n - password or token"
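A minimal sketch of the `python -m http.server --cgi` upload server from the TODO list above. The `artifact` form field, the `uploads/` layout, and the receipt / OIDC stub are made up for illustration:

```python
#!/usr/bin/env python
# cgi-bin/upload.py: run `python -m http.server --cgi` from the parent
# directory (this script must be marked executable). Stores uploads
# content addressed by SHA-384 digest.
import cgi
import hashlib
import pathlib
import sys

UPLOAD_DIR = pathlib.Path("uploads")
UPLOAD_DIR.mkdir(exist_ok=True)

form = cgi.FieldStorage()
if "artifact" not in form or not form["artifact"].file:
    sys.stdout.write("Status: 400 Bad Request\r\n\r\nmissing artifact field\n")
    sys.exit(0)

data = form["artifact"].file.read()
digest = hashlib.sha384(data).hexdigest()
# TODO require a SCITT receipt whose manifest contains this digest, plus
# an OIDC token (self-issued to start), before accepting the write.
(UPLOAD_DIR / digest).write_bytes(data)

sys.stdout.write("Content-Type: text/plain\r\n\r\n")
sys.stdout.write(digest + "\n")
```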
},
{
"body": "## 2022-11-14 SCITT Meeting Notes\r\n\r\n- https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#heading=h.214jg0n2xjhp\r\n- From Hannes Tschofenig to Everyone 08:02 AM\r\n - > - IoT device onboarding\r\n > - https://fidoalliance.org/specs/FDO/FIDO-Device-Onboard-PS-v1.1-20220419/FIDO-Device-Onboard-PS-v1.1-20220419.html\r\n > - http://www.openmobilealliance.org/release/LightweightM2M/V1_2-20201110-A/HTML-Version/OMA-TS-LightweightM2M_Core-V1_2-20201110-A.html\r\n > - http://www.openmobilealliance.org/release/LightweightM2M/V1_2-20201110-A/HTML-Version/OMA-TS-LightweightM2M_Transport-V1_2-20201110-A.html\r\n- NED IS HERE TODAY WOOHOO!!! He replied on the mailing list yesterday. John\r\n was stoked about that too. His involvement coming from IETF RATS to align on\r\n terminology is a good thing, since he's engaging in this train of thought.\r\n - See depth of field mapping.\r\n- Neil\r\n - Involved in Inernet to identity conference\r\n - Interested in way tot get firm attestations from people about documents\r\n - Worked at Bell labs and was involved in IETF security area in the 90s\r\n- Some refactoring needed on various docs\r\n- Hanes's use case document used as good example for what we are trying to do\r\n - Need more problem statement before going into solution space.\r\n - Recommendation: Use laymans terms, do not use solution terminology within\r\n use case docs and requirements and architecture and threat model.\r\n - There are some overloaded terms in the architecture terminology.\r\n - Some attestation endorsements (signed statement about the item or asset)\r\n - Some overlay in terms of is it an endorsement or is it something different.\r\n - What is the value add that attestation is already a starting point.\r\n - If the use case was already written to assume the attestation use case.\r\n - 3rd party attestation is an endorsement in RATS\r\n - https://www.rfc-editor.org/rfc/rfc7744\r\n - Use Cases for Authentication and Authorization in Constrained Environments\r\n - > ```\r\n > Table of Contents\r\n >\r\n > 1. Introduction ....................................................4\r\n > 1.1. Terminology ................................................4\r\n > 2. Use Cases .......................................................5\r\n > 2.1. Container Monitoring .......................................5\r\n > 2.1.1. Bananas for Munich ..................................6\r\n > 2.1.2. Authorization Problems Summary ......................7\r\n > 2.2. Home Automation ............................................8\r\n > 2.2.1. Controlling the Smart Home Infrastructure ...........8\r\n > 2.2.2. Seamless Authorization ..............................8\r\n > 2.2.3. Remotely Letting in a Visitor .......................9\r\n > 2.2.4. Selling the House ...................................9\r\n > 2.2.5. Authorization Problems Summary ......................9\r\n > 2.3. Personal Health Monitoring ................................10\r\n > 2.3.1. John and the Heart Rate Monitor ....................11\r\n > 2.3.2. Authorization Problems Summary .....................12\r\n > 2.4. Building Automation .......................................13\r\n > 2.4.1. Device Life Cycle ..................................13\r\n > 2.4.1.1. Installation and Commissioning ............13\r\n > 2.4.1.2. Operational ...............................14\r\n > 2.4.1.3. Maintenance ...............................15\r\n > 2.4.1.4. Recommissioning ...........................16\r\n > 2.4.1.5. 
Decommissioning ...........................16\r\n > 2.4.2. Public Safety ......................................17\r\n > 2.4.2.1. A Fire Breaks Out .........................17\r\n > 2.4.3. Authorization Problems Summary .....................18\r\n > 2.5. Smart Metering ............................................19\r\n > 2.5.1. Drive-By Metering ..................................19\r\n > 2.5.2. Meshed Topology ....................................20\r\n > 2.5.3. Advanced Metering Infrastructure ...................20\r\n > 2.5.4. Authorization Problems Summary .....................21\r\n > 2.6. Sports and Entertainment ..................................22\r\n > 2.6.1. Dynamically Connecting Smart Sports Equipment ......22\r\n > 2.6.2. Authorization Problems Summary .....................23\r\n > 2.7. Industrial Control Systems ................................23\r\n > 2.7.1. Oil Platform Control ...............................23\r\n > 2.7.2. Authorization Problems Summary .....................24\r\n > 3. Security Considerations ........................................24\r\n > 3.1. Attacks ...................................................25\r\n > 3.2. Configuration of Access Permissions .......................26\r\n > 3.3. Authorization Considerations ..............................26\r\n > 3.4. Proxies ...................................................28\r\n > 4. Privacy Considerations .........................................28\r\n > 5. Informative References .........................................28\r\n > Acknowledgments ...................................................29\r\n > Authors' Addresses ................................................30\r\n > ```\r\n- We need to address Ned's concern: define the clear scope of the difference\r\n between what SCITT adds and what IETF RATS attestation already offers.\r\n- Sylvan\r\n - Concrete scenario using confidential compute\r\n - Using hardware attestation reports about CCF running in the cloud\r\n - Say you're running a workload, you are running it in the cloud\r\n - Confidential containers, which covers the VMs, hostdata, MRCONFIG, and the policy used to say what you can run on that utility VM; it has a hardware attestation story and follows the RATS spec.\r\n - This can be passed out to anyone to verify and validate that the workload is what it was, based on measurement\r\n - Now you don't want a precise hash of MRENCLAVE on TDX; it's fine to run whatever as long as it's signed by a specific issuer, given endorsements. I might be handed a signature on an image; the provider might give a different signed image to someone else. What SCITT does (verifier policy): the UVM hash provides SCITT receipt validation and a feed for the UVM, which is the feed that identifies its purpose; from this parent on this __ of this SCITT image, expect the payload in COSE Sign1 to be the hash that you would find measured from the TPM\r\n - I want to be able to attest container applications, web apps, whatever\r\n - Can the attestation not report on that?\r\n - Sylvan has a particular view on how you report on the confidential container and how it attests to the workload\r\n - If we want to talk just about the base VM boot image: how do I make sure my provider can give me version 1.2 without breaking my workload, while I get transparency over every workload that can run (policy) and have receipts of it happening\r\n - As a verifier you only want to use a service if it's running code I care about\r\n - If I trust an arbitrary signer, then I can rely on the signature alone\r\n - But if I want SCITT, it's because I want 
auditable evidence; then I want a receipt that is universally verifiable. You can hand it off to customers to prove that I ran in a confidential compute environment.\r\n - Ned says: why do I need that? I have the precise hashes?\r\n - We have a post hoc auditability guarantee because it's transparent\r\n- RATS\r\n - Reference value providers\r\n - Verifier's job is to do a mapping between the two\r\n - The endorsement is a claim that is true if the reference value is matched to evidence\r\n - Those claims might be added to the accepted set if there was some Notary specific data\r\n - Verifier has a policy that says I trust this endorser and I don't trust that one.\r\n - SCITT Transparency is one thing we layer on top\r\n - By binding endorsers through the audit log we allow for online detection, but most people might do it after the fact (post-hoc, ad-hoc post?)\r\n- Attestation letter will be defined by CISA, this will become an artifact the customer will receive\r\n - How could they do that using RATS?\r\n - An attestation letter in this case sounds more like an endorsement (I'm a good guy, trust me, these are the things that I do)\r\n - SW vendor makes a claim, customer needs to be able to verify the trustworthiness of that claim\r\n - ISO 9000: hey, I'm following this process; are there auditors to make sure you're following it?\r\n - Customers might go to SCITT and say: has anyone looked at this thing? Is it trustworthy?\r\n - This is why DFFML cares about SCITT: adding data about development lifecycle processes acts as a self audit capability for people to run different static analysis (meta static analysis)\r\n - Is there a blocking process to get on the registry? No! (federation, DID based flat file; we can build all this offline and join disparate roots later, this is why we like SCITT)\r\n - Other parties can endorse and make transparent their endorsements (notary step)\r\n - Registration policy controls what signed statements can be made transparent; it can also say who can put signed statements in (OIDC) and make them transparent via this instance\r\n - We want to enable additional auditors to audit each other: they make additional statements, sign those statements and make them transparent via the SCITT log they submit to\r\n - This allows us to go N levels and N links deep on the graph in terms of what set we want to define as \"trusted\"\r\n- SW produces package\r\n - 3rd party produces endorsement about produced package (a 2nd witness of that claim)\r\n - Ned says this is possible with RATS; the thing it doesn't try to define is that you\r\n have to have that. They would call that an \"appraisal policy\": then you have to have\r\n this second entity (Alice? ;) doing the tests.\r\n - SCITT is saying those claims have to be published somewhere (even self, with the Alice\r\n offline case).\r\n - What value do those additional witnesses bring?\r\n - Existence of a receipt is proof that signed claims were made, and made in a\r\n specific order; they are tamper-proof (rather than just tamper evident).\r\n - With transparency I can accept an update, and know I can check later;\r\n if they lie, I can go find out that they lied.\r\n- TODO\r\n - [ ] Section on Federation (8)\r\n - [SCITT API Emulator Bring Up](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4110695)\r\n - We upload `alice shouldi contribute` dataflow to SCITT and get a receipt!\r\n - Friends, today is a great day. 
:railway_track:\r\n - Next stop, serialization / federation with Alice / Open Architecture serialization data flow as SCITT service.\r\n - Started with mermaid added in https://github.com/intel/dffml/commit/fbcbc86b5c52932bccf4cd6321f4e79f60ad3023 to https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0002_shes_ariving_when.md\r\n - [ ] Use case documents\r\n - [ ] OpenSSF Metrics\r\n - Use Microsoft SCITT API Emulator (MIT) as upstream / reference\r\n implementation. Talk about how to use the data provenance on the workflow\r\n (`alice shouldi contribute`).\r\n - We can then start doing the `did:merkle:` \"what do I care about\" intermediate\r\n representation to do cross-platform (Jenkins, GitHub Actions, etc.) caching\r\n / analysis of caching / please contribute streamlined.\r\n - Play with OIDC and SCITT\r\n - Later show overlayed flow on top of upstream (OpenSSF metrics or something\r\n ideally would be the upstream defining these flows, probably, in most cases).\r\n - Need to patch dataflows to include `upstream` as the flows / system context\r\n it came from if overlayed."
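To make the registration policy / receipt discussion above concrete, a toy stand-in. Real SCITT uses COSE Sign1 envelopes and Merkle tree inclusion proofs; this only shows the shape of the idea: a policy gates registration, and a receipt binds a claim to a position in an append-only log:

```python
import hashlib
import json

LOG: list[str] = []  # toy append-only transparency log (hashes of claims)


def registration_policy(claim: dict) -> bool:
    # Made-up policy: only accept claims whose issuer is a DID.
    return claim.get("issuer", "").startswith("did:")


def register(claim: dict) -> dict:
    if not registration_policy(claim):
        raise PermissionError("registration policy rejected claim")
    entry = hashlib.sha256(json.dumps(claim, sort_keys=True).encode()).hexdigest()
    LOG.append(entry)
    # The "receipt" binds the claim hash to its log position so that
    # anyone holding it can later audit that the log still contains it.
    return {"entry_hash": entry, "index": len(LOG) - 1}


receipt = register(
    {"issuer": "did:web:example.com", "subject": "alice shouldi contribute"}
)
assert LOG[receipt["index"]] == receipt["entry_hash"]
```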
},
{
"body": "# Alice, should I contribute? Data Flow\r\n\r\nCross post: https://gist.github.com/pdxjohnny/57b049c284e58f51d0a0d35d05d03d4a\r\nCross post: https://github.com/intel/dffml/discussions/1382#discussioncomment-4141177\r\nCross post: https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4141183\r\nUpstream: https://github.com/intel/dffml/tree/8847989eb4cc9f6aa484285ba9c11ff920113ed3\r\n\r\n```console\r\n$ export TITLE=\"Alice, should I contribute? Data Flow (upstream: https://github.com/intel/dffml/tree/8847989eb4cc9f6aa484285ba9c11ff920113ed3)\";\r\n$ (echo \"${TITLE}\" \\\r\n && echo \\\r\n && python -um dffml service dev export alice.cli:ALICE_COLLECTOR_DATAFLOW > alice_shouldi_contribute.json \\\r\n && echo '```mermaid' \\\r\n && python -um dffml dataflow diagram -stage processing -configloader json alice_shouldi_contribute.json \\\r\n && echo '```' \\\r\n && echo \\\r\n && echo '```yaml' \\\r\n && python -c \"import sys, pathlib, json, yaml; print(yaml.dump(json.load(sys.stdin)))\" < alice_shouldi_contribute.json \\\r\n && echo '```' \\\r\n && echo) \\\r\n | gh gist create --public --desc \"${TITLE}\" -f ALICE_SHOULDI_CONTRIBUTE_THREATS.md -\r\n```\r\n\r\n```mermaid\r\ngraph TD\r\nsubgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL]\r\nstyle d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71\r\nf577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL]\r\n7440e73a8e8f864097f42162b74f2762(URL)\r\n7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40\r\n8e39b501b41c5d0e4596318f80a03210(valid)\r\nf577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210\r\nend\r\nsubgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo]\r\nstyle af8da22d1318d911f29b95e687f87c5d fill:#fff4de,stroke:#cece71\r\n155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo]\r\need77b9eea541e0c378c67395351099c(URL)\r\need77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n8b5928cd265dd2c44d67d076f60c8b05(ssh_key)\r\n8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n4e1d5ea96e050e46ebf95ebc0713d54c(repo)\r\n155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c\r\n6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL}\r\n6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\nend\r\nsubgraph d367039fa2c485f55058105e7e0c0b6b[count_authors]\r\nstyle d367039fa2c485f55058105e7e0c0b6b fill:#fff4de,stroke:#cece71\r\n70c47962ba601f0df1890f4c72ae1b54[count_authors]\r\n0637dcbe07cd05b96d0a6a2dfbb0c5ff(author_lines)\r\n0637dcbe07cd05b96d0a6a2dfbb0c5ff --> 70c47962ba601f0df1890f4c72ae1b54\r\ne1d1567e6b3a3e5d899b9543c693a66f(authors)\r\n70c47962ba601f0df1890f4c72ae1b54 --> e1d1567e6b3a3e5d899b9543c693a66f\r\nend\r\nsubgraph 7c3ab755010b5134c7c3c5be9fed1f1c[dffml_feature_git.feature.operations:git_grep]\r\nstyle 7c3ab755010b5134c7c3c5be9fed1f1c fill:#fff4de,stroke:#cece71\r\n7155c0a875a889898d6d6e0c7959649b[dffml_feature_git.feature.operations:git_grep]\r\n1fc5390b128a11a95280a89ad371a5ae(repo)\r\n1fc5390b128a11a95280a89ad371a5ae --> 7155c0a875a889898d6d6e0c7959649b\r\ncc134251a8bdd1d0944ea69eafc239a4(search)\r\ncc134251a8bdd1d0944ea69eafc239a4 --> 7155c0a875a889898d6d6e0c7959649b\r\n8b7a73c5b4f92ff7fb362de5d8e90b3e(found)\r\n7155c0a875a889898d6d6e0c7959649b --> 8b7a73c5b4f92ff7fb362de5d8e90b3e\r\nend\r\nsubgraph 2863a5f2869f0187864ff7a8afcbc2f5[dffml_operations_innersource.cli:ensure_tokei]\r\nstyle 2863a5f2869f0187864ff7a8afcbc2f5 
fill:#fff4de,stroke:#cece71\r\na7fe94e6e97c131edebbf73cca7b8852[dffml_operations_innersource.cli:ensure_tokei]\r\n3f6fe14c9392820b8562f809c7e2b8b4(result)\r\na7fe94e6e97c131edebbf73cca7b8852 --> 3f6fe14c9392820b8562f809c7e2b8b4\r\nend\r\nsubgraph 1f8d333356c8981dfc553c7eb00bf366[dffml_operations_innersource.cli:github_repo_id_to_clone_url]\r\nstyle 1f8d333356c8981dfc553c7eb00bf366 fill:#fff4de,stroke:#cece71\r\n859feff15e5487fdad83ec4c42c506e7[dffml_operations_innersource.cli:github_repo_id_to_clone_url]\r\nd2bc011260868bff46d1a206c404a549(repo_id)\r\nd2bc011260868bff46d1a206c404a549 --> 859feff15e5487fdad83ec4c42c506e7\r\n1f6ba749c4b65c55218b968bf308e4e2(result)\r\n859feff15e5487fdad83ec4c42c506e7 --> 1f6ba749c4b65c55218b968bf308e4e2\r\nend\r\nsubgraph f2b87480bbba5729364d76ad2fd5ef17[dffml_operations_innersource.operations:action_yml_files]\r\nstyle f2b87480bbba5729364d76ad2fd5ef17 fill:#fff4de,stroke:#cece71\r\n4de0ba6484f92eba7073404d21fb3598[dffml_operations_innersource.operations:action_yml_files]\r\n847cd99cca177936d533aaa4918c6699(repo)\r\n847cd99cca177936d533aaa4918c6699 --> 4de0ba6484f92eba7073404d21fb3598\r\n7fa0f9133dfd9f00a90383b38c2ec840(result)\r\n4de0ba6484f92eba7073404d21fb3598 --> 7fa0f9133dfd9f00a90383b38c2ec840\r\nend\r\nsubgraph 98179e1c9444a758d9565431f371b232[dffml_operations_innersource.operations:code_of_conduct_present]\r\nstyle 98179e1c9444a758d9565431f371b232 fill:#fff4de,stroke:#cece71\r\nfb772128fdc785ce816c73128e0afd4d[dffml_operations_innersource.operations:code_of_conduct_present]\r\nf333b126c62bdbf832dddf105278d218(repo)\r\nf333b126c62bdbf832dddf105278d218 --> fb772128fdc785ce816c73128e0afd4d\r\n1233aac886e50641252dcad2124003c9(result)\r\nfb772128fdc785ce816c73128e0afd4d --> 1233aac886e50641252dcad2124003c9\r\nend\r\nsubgraph d03657cbeff4a7501071526c5227d605[dffml_operations_innersource.operations:contributing_present]\r\nstyle d03657cbeff4a7501071526c5227d605 fill:#fff4de,stroke:#cece71\r\n8da2c8a3eddf27e38838c8b6a2cd4ad1[dffml_operations_innersource.operations:contributing_present]\r\n2a1ae8bcc9add3c42e071d0557e98b1c(repo)\r\n2a1ae8bcc9add3c42e071d0557e98b1c --> 8da2c8a3eddf27e38838c8b6a2cd4ad1\r\n52544c54f59ff4838d42ba3472b02589(result)\r\n8da2c8a3eddf27e38838c8b6a2cd4ad1 --> 52544c54f59ff4838d42ba3472b02589\r\nend\r\nsubgraph 3ac62bbb02d944121299b756fc806782[dffml_operations_innersource.operations:get_current_datetime_as_git_date]\r\nstyle 3ac62bbb02d944121299b756fc806782 fill:#fff4de,stroke:#cece71\r\n913421183cb3f7803fb82a12e4ee711f[dffml_operations_innersource.operations:get_current_datetime_as_git_date]\r\ne17cbcbbf2d11ed5ce43603779758076(result)\r\n913421183cb3f7803fb82a12e4ee711f --> e17cbcbbf2d11ed5ce43603779758076\r\nend\r\nsubgraph 5827679f9c689590302b3f46277551ec[dffml_operations_innersource.operations:github_workflows]\r\nstyle 5827679f9c689590302b3f46277551ec fill:#fff4de,stroke:#cece71\r\n160833350a633bb60ee3880fb824189e[dffml_operations_innersource.operations:github_workflows]\r\ncaaae91348f7c892daa1d05fbd221352(repo)\r\ncaaae91348f7c892daa1d05fbd221352 --> 160833350a633bb60ee3880fb824189e\r\n882be05f5b4ede0846177f68fc70cfd4(result)\r\n160833350a633bb60ee3880fb824189e --> 882be05f5b4ede0846177f68fc70cfd4\r\nend\r\nsubgraph f1a14368132c9536201d6260d7fc6b63[dffml_operations_innersource.operations:groovy_files]\r\nstyle f1a14368132c9536201d6260d7fc6b63 
fill:#fff4de,stroke:#cece71\r\nd86d2384b02c75979f3a21818187764e[dffml_operations_innersource.operations:groovy_files]\r\n37b63c13bc63cddeaba57cee5dc3f613(repo)\r\n37b63c13bc63cddeaba57cee5dc3f613 --> d86d2384b02c75979f3a21818187764e\r\n6e31b041bad7c24fa5b0a793ff20890b(result)\r\nd86d2384b02c75979f3a21818187764e --> 6e31b041bad7c24fa5b0a793ff20890b\r\nend\r\nsubgraph 49272b4d054d834d0dfd08d62360a489[dffml_operations_innersource.operations:jenkinsfiles]\r\nstyle 49272b4d054d834d0dfd08d62360a489 fill:#fff4de,stroke:#cece71\r\na31545bdef7e66159d0b56861e4a4fa3[dffml_operations_innersource.operations:jenkinsfiles]\r\n449ec8a512ad1a002c5bbbd0fc8294e9(repo)\r\n449ec8a512ad1a002c5bbbd0fc8294e9 --> a31545bdef7e66159d0b56861e4a4fa3\r\n4963673c5f8ef045573769c58fc54a77(result)\r\na31545bdef7e66159d0b56861e4a4fa3 --> 4963673c5f8ef045573769c58fc54a77\r\nend\r\nsubgraph 3ab6f933ff2c5d1c31f5acce50ace507[dffml_operations_innersource.operations:readme_present]\r\nstyle 3ab6f933ff2c5d1c31f5acce50ace507 fill:#fff4de,stroke:#cece71\r\nae6634d141e4d989b0f53fd3b849b101[dffml_operations_innersource.operations:readme_present]\r\n4d289d268d52d6fb5795893363300585(repo)\r\n4d289d268d52d6fb5795893363300585 --> ae6634d141e4d989b0f53fd3b849b101\r\n65fd35d17d8a7e96c9f7e6aaedb75e3c(result)\r\nae6634d141e4d989b0f53fd3b849b101 --> 65fd35d17d8a7e96c9f7e6aaedb75e3c\r\nend\r\nsubgraph da39b149b9fed20f273450b47a0b65f4[dffml_operations_innersource.operations:security_present]\r\nstyle da39b149b9fed20f273450b47a0b65f4 fill:#fff4de,stroke:#cece71\r\nc8921544f4665e73080cb487aef7de94[dffml_operations_innersource.operations:security_present]\r\ne682bbcfad20caaab15e4220c81e9239(repo)\r\ne682bbcfad20caaab15e4220c81e9239 --> c8921544f4665e73080cb487aef7de94\r\n5d69c4e5b3601abbd692ade806dcdf5f(result)\r\nc8921544f4665e73080cb487aef7de94 --> 5d69c4e5b3601abbd692ade806dcdf5f\r\nend\r\nsubgraph 062b8882104862540d584516edc60008[dffml_operations_innersource.operations:support_present]\r\nstyle 062b8882104862540d584516edc60008 fill:#fff4de,stroke:#cece71\r\n5cc75c20aee40e815abf96726508b66d[dffml_operations_innersource.operations:support_present]\r\nf0e4cd91ca4f6b278478180a188a2f5f(repo)\r\nf0e4cd91ca4f6b278478180a188a2f5f --> 5cc75c20aee40e815abf96726508b66d\r\n46bd597a57e034f669df18ac9ae0a153(result)\r\n5cc75c20aee40e815abf96726508b66d --> 46bd597a57e034f669df18ac9ae0a153\r\nend\r\nsubgraph 208d072a660149b8e7b7e55de1b6d4dd[git_commits]\r\nstyle 208d072a660149b8e7b7e55de1b6d4dd fill:#fff4de,stroke:#cece71\r\n90b953c5527ed3a579912eea8b02b1be[git_commits]\r\ne0d40a3d87e4946fdf517eaa40848e39(branch)\r\ne0d40a3d87e4946fdf517eaa40848e39 --> 90b953c5527ed3a579912eea8b02b1be\r\n44051d3d0587f293a2f36fb2fca3986e(repo)\r\n44051d3d0587f293a2f36fb2fca3986e --> 90b953c5527ed3a579912eea8b02b1be\r\n80b9ea20367299aca462989eb0356ccf(start_end)\r\n80b9ea20367299aca462989eb0356ccf --> 90b953c5527ed3a579912eea8b02b1be\r\nf75e51a2fca4258c207b5473f62e53e0(commits)\r\n90b953c5527ed3a579912eea8b02b1be --> f75e51a2fca4258c207b5473f62e53e0\r\nend\r\nsubgraph a6fadf4f2f5031106e26cfc42fa08fcd[git_repo_author_lines_for_dates]\r\nstyle a6fadf4f2f5031106e26cfc42fa08fcd fill:#fff4de,stroke:#cece71\r\n0afa2b3dbc72afa67170525d1d7532d7[git_repo_author_lines_for_dates]\r\n3396a58cd186eda4908308395f2421c4(branch)\r\n3396a58cd186eda4908308395f2421c4 --> 0afa2b3dbc72afa67170525d1d7532d7\r\n5ca6153629c6af49e61eb6d5c95c64f2(repo)\r\n5ca6153629c6af49e61eb6d5c95c64f2 --> 0afa2b3dbc72afa67170525d1d7532d7\r\nfef3455ecf4fc7a993cb14c43d4d345f(start_end)\r\nfef3455ecf4fc7a993cb14c43d4d345f 
--> 0afa2b3dbc72afa67170525d1d7532d7\r\n3bf05667f7df95bb2ae3b614ea998cff(author_lines)\r\n0afa2b3dbc72afa67170525d1d7532d7 --> 3bf05667f7df95bb2ae3b614ea998cff\r\nend\r\nsubgraph 2a6fb4d7ae016ca95fcfc061d3d1b8ab[git_repo_checkout]\r\nstyle 2a6fb4d7ae016ca95fcfc061d3d1b8ab fill:#fff4de,stroke:#cece71\r\n02de40331374616f64ba4a92fbb33edd[git_repo_checkout]\r\n2b82220f7c12c2e39d2dd6330ec875bd(commit)\r\n2b82220f7c12c2e39d2dd6330ec875bd --> 02de40331374616f64ba4a92fbb33edd\r\n95dc6c133455588bd30b1116c857b624(repo)\r\n95dc6c133455588bd30b1116c857b624 --> 02de40331374616f64ba4a92fbb33edd\r\nc762e289fa4f1cd4c4d96b57422f2a81(repo)\r\n02de40331374616f64ba4a92fbb33edd --> c762e289fa4f1cd4c4d96b57422f2a81\r\nend\r\nsubgraph d9401f19394958bb1ad2dd4dfc37fa79[git_repo_commit_from_date]\r\nstyle d9401f19394958bb1ad2dd4dfc37fa79 fill:#fff4de,stroke:#cece71\r\n7bbb97768b34f207c34c1f4721708675[git_repo_commit_from_date]\r\nba10b1d34771f904ff181cb361864ab2(branch)\r\nba10b1d34771f904ff181cb361864ab2 --> 7bbb97768b34f207c34c1f4721708675\r\n13e4349f6f7f4c9f65ae38767fab1bd5(date)\r\n13e4349f6f7f4c9f65ae38767fab1bd5 --> 7bbb97768b34f207c34c1f4721708675\r\n0c19b6fe88747ef09defde05a60e8d84(repo)\r\n0c19b6fe88747ef09defde05a60e8d84 --> 7bbb97768b34f207c34c1f4721708675\r\n4941586112b4011d0c72c6264b816db4(commit)\r\n7bbb97768b34f207c34c1f4721708675 --> 4941586112b4011d0c72c6264b816db4\r\nend\r\nsubgraph d3d91578caf34c0ae944b17853783406[git_repo_default_branch]\r\nstyle d3d91578caf34c0ae944b17853783406 fill:#fff4de,stroke:#cece71\r\n546062a96122df465d2631f31df4e9e3[git_repo_default_branch]\r\n181f1b33df4d795fbad2911ec7087e86(repo)\r\n181f1b33df4d795fbad2911ec7087e86 --> 546062a96122df465d2631f31df4e9e3\r\n57651c1bcd24b794dfc8d1794ab556d5(branch)\r\n546062a96122df465d2631f31df4e9e3 --> 57651c1bcd24b794dfc8d1794ab556d5\r\n5ed1ab77e726d7efdcc41e9e2f8039c6(remote)\r\n546062a96122df465d2631f31df4e9e3 --> 5ed1ab77e726d7efdcc41e9e2f8039c6\r\n4c3cdd5f15b7a846d291aac089e8a622{no_git_branch_given}\r\n4c3cdd5f15b7a846d291aac089e8a622 --> 546062a96122df465d2631f31df4e9e3\r\nend\r\nsubgraph f9155f693f3d5c1dd132e4f9e32175b8[git_repo_release]\r\nstyle f9155f693f3d5c1dd132e4f9e32175b8 fill:#fff4de,stroke:#cece71\r\nf01273bde2638114cff25a747963223e[git_repo_release]\r\na5df26b9f1fb4360aac38ee7ad6c5041(branch)\r\na5df26b9f1fb4360aac38ee7ad6c5041 --> f01273bde2638114cff25a747963223e\r\n84255574141c7ee6735c88c70cb4dc35(repo)\r\n84255574141c7ee6735c88c70cb4dc35 --> f01273bde2638114cff25a747963223e\r\nb2e4d6aa4a5bfba38584dc028dfc35b8(start_end)\r\nb2e4d6aa4a5bfba38584dc028dfc35b8 --> f01273bde2638114cff25a747963223e\r\n2cd7c2339d5e783198a219f02af0240a(present)\r\nf01273bde2638114cff25a747963223e --> 2cd7c2339d5e783198a219f02af0240a\r\nend\r\nsubgraph b121cc70dccc771127b429709d55d6d5[lines_of_code_by_language]\r\nstyle b121cc70dccc771127b429709d55d6d5 fill:#fff4de,stroke:#cece71\r\nef6d613ca7855a13865933156c79ddea[lines_of_code_by_language]\r\n0b781c240b2945323081606938fdf136(repo)\r\n0b781c240b2945323081606938fdf136 --> ef6d613ca7855a13865933156c79ddea\r\ne51defd3debc1237bf64e6ae611595f7(lines_by_language)\r\nef6d613ca7855a13865933156c79ddea --> e51defd3debc1237bf64e6ae611595f7\r\nf5eb786f700f1aefd37023db219961a1{str}\r\nf5eb786f700f1aefd37023db219961a1 --> ef6d613ca7855a13865933156c79ddea\r\nend\r\nsubgraph 35551a739c7d12be0fed88e1d92a296c[lines_of_code_to_comments]\r\nstyle 35551a739c7d12be0fed88e1d92a296c 
fill:#fff4de,stroke:#cece71\r\nb6e1f853d077365deddea22b2fdb890d[lines_of_code_to_comments]\r\n669759049f3ac6927280566ef45cf980(langs)\r\n669759049f3ac6927280566ef45cf980 --> b6e1f853d077365deddea22b2fdb890d\r\n850cdec03e4988f119a67899cbc5f311(code_to_comment_ratio)\r\nb6e1f853d077365deddea22b2fdb890d --> 850cdec03e4988f119a67899cbc5f311\r\nend\r\nsubgraph 00b5efb50d0353b48966d833eabb1757[make_quarters]\r\nstyle 00b5efb50d0353b48966d833eabb1757 fill:#fff4de,stroke:#cece71\r\n7f20bd2c94ecbd47ab6bd88673c7174f[make_quarters]\r\n89dd142dfced4933070ebf4ffaff2630(number)\r\n89dd142dfced4933070ebf4ffaff2630 --> 7f20bd2c94ecbd47ab6bd88673c7174f\r\n224e033ecd73401fc95efaa7d7fa799b(quarters)\r\n7f20bd2c94ecbd47ab6bd88673c7174f --> 224e033ecd73401fc95efaa7d7fa799b\r\nend\r\nsubgraph 87b1836daeb62eee5488373bd36b0c48[quarters_back_to_date]\r\nstyle 87b1836daeb62eee5488373bd36b0c48 fill:#fff4de,stroke:#cece71\r\n9dc9f9feff38d8f5dd9388d3a60e74c0[quarters_back_to_date]\r\n00bf6f65f7fa0d1ffce8e87585fae1b5(date)\r\n00bf6f65f7fa0d1ffce8e87585fae1b5 --> 9dc9f9feff38d8f5dd9388d3a60e74c0\r\n8a2fb544746a0e8f0a8984210e6741dc(number)\r\n8a2fb544746a0e8f0a8984210e6741dc --> 9dc9f9feff38d8f5dd9388d3a60e74c0\r\ncf114d5eea4795cef497592d0632bad7(date)\r\n9dc9f9feff38d8f5dd9388d3a60e74c0 --> cf114d5eea4795cef497592d0632bad7\r\n9848c2c8981da29ca1cbce32c1a4e457(start_end)\r\n9dc9f9feff38d8f5dd9388d3a60e74c0 --> 9848c2c8981da29ca1cbce32c1a4e457\r\nend\r\nsubgraph 6d61616898ab2c6024fd2a04faba8e02[work]\r\nstyle 6d61616898ab2c6024fd2a04faba8e02 fill:#fff4de,stroke:#cece71\r\n67e92c8765a9bc7fb2d335c459de9eb5[work]\r\n91794b0e2b5307720bed41f22724c339(author_lines)\r\n91794b0e2b5307720bed41f22724c339 --> 67e92c8765a9bc7fb2d335c459de9eb5\r\n8fd602a64430dd860b0a280217d8ccef(work)\r\n67e92c8765a9bc7fb2d335c459de9eb5 --> 8fd602a64430dd860b0a280217d8ccef\r\nend\r\n1f6ba749c4b65c55218b968bf308e4e2 --> 7440e73a8e8f864097f42162b74f2762\r\n7ec43cbbf66e6d893180645d5e929bb4(seed<br>URL)\r\nstyle 7ec43cbbf66e6d893180645d5e929bb4 fill:#f6dbf9,stroke:#a178ca\r\n7ec43cbbf66e6d893180645d5e929bb4 --> 7440e73a8e8f864097f42162b74f2762\r\n1f6ba749c4b65c55218b968bf308e4e2 --> eed77b9eea541e0c378c67395351099c\r\n7ec43cbbf66e6d893180645d5e929bb4(seed<br>URL)\r\nstyle 7ec43cbbf66e6d893180645d5e929bb4 fill:#f6dbf9,stroke:#a178ca\r\n7ec43cbbf66e6d893180645d5e929bb4 --> eed77b9eea541e0c378c67395351099c\r\na6ed501edbf561fda49a0a0a3ca310f0(seed<br>git_repo_ssh_key)\r\nstyle a6ed501edbf561fda49a0a0a3ca310f0 fill:#f6dbf9,stroke:#a178ca\r\na6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05\r\n8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce\r\n3bf05667f7df95bb2ae3b614ea998cff --> 0637dcbe07cd05b96d0a6a2dfbb0c5ff\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 1fc5390b128a11a95280a89ad371a5ae\r\n0690fdb25283b1e0a09016a28aa08c08(seed<br>git_grep_search)\r\nstyle 0690fdb25283b1e0a09016a28aa08c08 fill:#f6dbf9,stroke:#a178ca\r\n0690fdb25283b1e0a09016a28aa08c08 --> cc134251a8bdd1d0944ea69eafc239a4\r\n090b151d70cc5b37562b42c64cb16bb0(seed<br>GitHubRepoID)\r\nstyle 090b151d70cc5b37562b42c64cb16bb0 fill:#f6dbf9,stroke:#a178ca\r\n090b151d70cc5b37562b42c64cb16bb0 --> d2bc011260868bff46d1a206c404a549\r\nc762e289fa4f1cd4c4d96b57422f2a81 --> 847cd99cca177936d533aaa4918c6699\r\nc762e289fa4f1cd4c4d96b57422f2a81 --> f333b126c62bdbf832dddf105278d218\r\nc762e289fa4f1cd4c4d96b57422f2a81 --> 2a1ae8bcc9add3c42e071d0557e98b1c\r\nc762e289fa4f1cd4c4d96b57422f2a81 --> caaae91348f7c892daa1d05fbd221352\r\nc762e289fa4f1cd4c4d96b57422f2a81 --> 
37b63c13bc63cddeaba57cee5dc3f613\r\nc762e289fa4f1cd4c4d96b57422f2a81 --> 449ec8a512ad1a002c5bbbd0fc8294e9\r\nc762e289fa4f1cd4c4d96b57422f2a81 --> 4d289d268d52d6fb5795893363300585\r\nc762e289fa4f1cd4c4d96b57422f2a81 --> e682bbcfad20caaab15e4220c81e9239\r\nc762e289fa4f1cd4c4d96b57422f2a81 --> f0e4cd91ca4f6b278478180a188a2f5f\r\n57651c1bcd24b794dfc8d1794ab556d5 --> e0d40a3d87e4946fdf517eaa40848e39\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 44051d3d0587f293a2f36fb2fca3986e\r\n9848c2c8981da29ca1cbce32c1a4e457 --> 80b9ea20367299aca462989eb0356ccf\r\n57651c1bcd24b794dfc8d1794ab556d5 --> 3396a58cd186eda4908308395f2421c4\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 5ca6153629c6af49e61eb6d5c95c64f2\r\n9848c2c8981da29ca1cbce32c1a4e457 --> fef3455ecf4fc7a993cb14c43d4d345f\r\n4941586112b4011d0c72c6264b816db4 --> 2b82220f7c12c2e39d2dd6330ec875bd\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 95dc6c133455588bd30b1116c857b624\r\n57651c1bcd24b794dfc8d1794ab556d5 --> ba10b1d34771f904ff181cb361864ab2\r\ncf114d5eea4795cef497592d0632bad7 --> 13e4349f6f7f4c9f65ae38767fab1bd5\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 0c19b6fe88747ef09defde05a60e8d84\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 181f1b33df4d795fbad2911ec7087e86\r\n2334372b57604cd06ceaf611e1c4a458(no_git_branch_given)\r\n2334372b57604cd06ceaf611e1c4a458 --> 4c3cdd5f15b7a846d291aac089e8a622\r\n57651c1bcd24b794dfc8d1794ab556d5 --> a5df26b9f1fb4360aac38ee7ad6c5041\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 84255574141c7ee6735c88c70cb4dc35\r\n9848c2c8981da29ca1cbce32c1a4e457 --> b2e4d6aa4a5bfba38584dc028dfc35b8\r\nc762e289fa4f1cd4c4d96b57422f2a81 --> 0b781c240b2945323081606938fdf136\r\n3c4eda0137cefa5452a87052978523ce --> f5eb786f700f1aefd37023db219961a1\r\n176c8001e30dae223370012eeb537711 --> f5eb786f700f1aefd37023db219961a1\r\n3f6fe14c9392820b8562f809c7e2b8b4 --> f5eb786f700f1aefd37023db219961a1\r\ne51defd3debc1237bf64e6ae611595f7 --> 669759049f3ac6927280566ef45cf980\r\na8b3d979c7c66aeb3b753408c3da0976(seed<br>quarters)\r\nstyle a8b3d979c7c66aeb3b753408c3da0976 fill:#f6dbf9,stroke:#a178ca\r\na8b3d979c7c66aeb3b753408c3da0976 --> 89dd142dfced4933070ebf4ffaff2630\r\ne17cbcbbf2d11ed5ce43603779758076 --> 00bf6f65f7fa0d1ffce8e87585fae1b5\r\n224e033ecd73401fc95efaa7d7fa799b --> 8a2fb544746a0e8f0a8984210e6741dc\r\n3bf05667f7df95bb2ae3b614ea998cff --> 91794b0e2b5307720bed41f22724c339\r\n```\r\n\r\n<details>\r\n<summary>Full dataflow</summary>\r\n\r\n```yaml\r\nconfigs:\r\n dffml_operations_innersource.cli:ensure_tokei:\r\n cache_dir: .tools/open-architecture/innersource/.cache/tokei\r\n platform_urls:\r\n Darwin:\r\n expected_hash: 8c8a1d8d8dd4d8bef93dabf5d2f6e27023777f8553393e269765d7ece85e68837cba4374a2615d83f071dfae22ba40e2\r\n url: https://github.com/XAMPPRocky/tokei/releases/download/v10.1.1/tokei-v10.1.1-x86_64-apple-darwin.tar.gz\r\n Linux:\r\n expected_hash: 22699e16e71f07ff805805d26ee86ecb9b1052d7879350f7eb9ed87beb0e6b84fbb512963d01b75cec8e80532e4ea29a\r\n url: https://github.com/XAMPPRocky/tokei/releases/download/v10.1.1/tokei-v10.1.1-x86_64-unknown-linux-gnu.tar.gz\r\ndefinitions:\r\n ActionYAMLFileWorkflowUnixStylePath:\r\n links:\r\n - - - name\r\n - str\r\n - - primitive\r\n - str\r\n name: ActionYAMLFileWorkflowUnixStylePath\r\n primitive: str\r\n CICDLibrary:\r\n links:\r\n - - - name\r\n - dict\r\n - - primitive\r\n - map\r\n name: CICDLibrary\r\n primitive: dict\r\n FileCodeOfConductPresent:\r\n links:\r\n - - - name\r\n - bool\r\n - - primitive\r\n - bool\r\n name: FileCodeOfConductPresent\r\n primitive: bool\r\n FileContributingPresent:\r\n links:\r\n - - - 
name\r\n - bool\r\n - - primitive\r\n - bool\r\n name: FileContributingPresent\r\n primitive: bool\r\n FileReadmePresent:\r\n links:\r\n - - - name\r\n - bool\r\n - - primitive\r\n - bool\r\n name: FileReadmePresent\r\n primitive: bool\r\n FileSecurityPresent:\r\n links:\r\n - - - name\r\n - bool\r\n - - primitive\r\n - bool\r\n name: FileSecurityPresent\r\n primitive: bool\r\n FileSupportPresent:\r\n links:\r\n - - - name\r\n - bool\r\n - - primitive\r\n - bool\r\n name: FileSupportPresent\r\n primitive: bool\r\n GitHubActionsWorkflowUnixStylePath:\r\n links:\r\n - - - name\r\n - str\r\n - - primitive\r\n - str\r\n name: GitHubActionsWorkflowUnixStylePath\r\n primitive: str\r\n GitHubRepoID:\r\n links:\r\n - - - name\r\n - str\r\n - - primitive\r\n - str\r\n name: GitHubRepoID\r\n primitive: str\r\n GroovyFileWorkflowUnixStylePath:\r\n links:\r\n - - - name\r\n - str\r\n - - primitive\r\n - str\r\n name: GroovyFileWorkflowUnixStylePath\r\n primitive: str\r\n IsCICDGitHubActionsLibrary:\r\n links:\r\n - - - name\r\n - bool\r\n - - primitive\r\n - bool\r\n name: IsCICDGitHubActionsLibrary\r\n primitive: bool\r\n IsCICDJenkinsLibrary:\r\n links:\r\n - - - name\r\n - bool\r\n - - primitive\r\n - bool\r\n name: IsCICDJenkinsLibrary\r\n primitive: bool\r\n JenkinsfileWorkflowUnixStylePath:\r\n links:\r\n - - - name\r\n - str\r\n - - primitive\r\n - str\r\n name: JenkinsfileWorkflowUnixStylePath\r\n primitive: str\r\n URL:\r\n links:\r\n - - - name\r\n - str\r\n - - primitive\r\n - str\r\n name: URL\r\n primitive: str\r\n author_count:\r\n name: author_count\r\n primitive: int\r\n author_line_count:\r\n name: author_line_count\r\n primitive: Dict[str, int]\r\n bool:\r\n name: bool\r\n primitive: bool\r\n commit_count:\r\n name: commit_count\r\n primitive: int\r\n date:\r\n name: date\r\n primitive: string\r\n date_pair:\r\n name: date_pair\r\n primitive: List[date]\r\n git_branch:\r\n links:\r\n - - - name\r\n - str\r\n - - primitive\r\n - str\r\n name: git_branch\r\n primitive: str\r\n git_commit:\r\n name: git_commit\r\n primitive: string\r\n git_grep_found:\r\n name: git_grep_found\r\n primitive: string\r\n git_grep_search:\r\n name: git_grep_search\r\n primitive: string\r\n git_remote:\r\n links:\r\n - - - name\r\n - str\r\n - - primitive\r\n - str\r\n name: git_remote\r\n primitive: str\r\n git_repo_ssh_key:\r\n default: null\r\n name: git_repo_ssh_key\r\n primitive: string\r\n git_repository:\r\n lock: true\r\n name: git_repository\r\n primitive: Dict[str, str]\r\n spec:\r\n defaults:\r\n URL: null\r\n name: GitRepoSpec\r\n types:\r\n URL: str\r\n directory: str\r\n subspec: false\r\n git_repository_checked_out:\r\n lock: true\r\n name: git_repository_checked_out\r\n primitive: Dict[str, str]\r\n spec:\r\n defaults:\r\n URL: null\r\n commit: null\r\n name: GitRepoCheckedOutSpec\r\n types:\r\n URL: str\r\n commit: str\r\n directory: str\r\n subspec: false\r\n group_by_output:\r\n name: group_by_output\r\n primitive: Dict[str, List[Any]]\r\n group_by_spec:\r\n name: group_by_spec\r\n primitive: Dict[str, Any]\r\n language_to_comment_ratio:\r\n name: language_to_comment_ratio\r\n primitive: int\r\n lines_by_language_count:\r\n name: lines_by_language_count\r\n primitive: Dict[str, Dict[str, int]]\r\n no_git_branch_given:\r\n name: no_git_branch_given\r\n primitive: boolean\r\n quarter:\r\n name: quarter\r\n primitive: int\r\n quarter_start_date:\r\n name: quarter_start_date\r\n primitive: int\r\n quarters:\r\n name: quarters\r\n primitive: int\r\n release_within_period:\r\n name: 
release_within_period\r\n primitive: bool\r\n str:\r\n name: str\r\n primitive: str\r\n valid_git_repository_URL:\r\n name: valid_git_repository_URL\r\n primitive: boolean\r\n work_spread:\r\n name: work_spread\r\n primitive: int\r\nflow:\r\n alice.shouldi.contribute.cicd:cicd_action_library:\r\n inputs:\r\n action_file_paths:\r\n - dffml_operations_innersource.operations:action_yml_files: result\r\n alice.shouldi.contribute.cicd:cicd_jenkins_library:\r\n inputs:\r\n groovy_file_paths:\r\n - dffml_operations_innersource.operations:groovy_files: result\r\n alice.shouldi.contribute.cicd:cicd_library:\r\n inputs:\r\n cicd_action_library:\r\n - alice.shouldi.contribute.cicd:cicd_action_library: result\r\n cicd_jenkins_library:\r\n - alice.shouldi.contribute.cicd:cicd_jenkins_library: result\r\n check_if_valid_git_repository_URL:\r\n inputs:\r\n URL:\r\n - dffml_operations_innersource.cli:github_repo_id_to_clone_url: result\r\n - seed\r\n cleanup_git_repo:\r\n inputs:\r\n repo:\r\n - clone_git_repo: repo\r\n clone_git_repo:\r\n conditions:\r\n - check_if_valid_git_repository_URL: valid\r\n inputs:\r\n URL:\r\n - dffml_operations_innersource.cli:github_repo_id_to_clone_url: result\r\n - seed\r\n ssh_key:\r\n - seed\r\n count_authors:\r\n inputs:\r\n author_lines:\r\n - git_repo_author_lines_for_dates: author_lines\r\n dffml_feature_git.feature.operations:git_grep:\r\n inputs:\r\n repo:\r\n - clone_git_repo: repo\r\n search:\r\n - seed\r\n dffml_operations_innersource.cli:ensure_tokei:\r\n inputs: {}\r\n dffml_operations_innersource.cli:github_repo_id_to_clone_url:\r\n inputs:\r\n repo_id:\r\n - seed\r\n dffml_operations_innersource.operations:action_yml_files:\r\n inputs:\r\n repo:\r\n - git_repo_checkout: repo\r\n dffml_operations_innersource.operations:badge_maintained:\r\n conditions:\r\n - dffml_operations_innersource.operations:maintained: result\r\n - dffml_operations_innersource.operations:unmaintained: result\r\n inputs: {}\r\n dffml_operations_innersource.operations:badge_unmaintained:\r\n conditions:\r\n - dffml_operations_innersource.operations:maintained: result\r\n - dffml_operations_innersource.operations:unmaintained: result\r\n inputs: {}\r\n dffml_operations_innersource.operations:code_of_conduct_present:\r\n inputs:\r\n repo:\r\n - git_repo_checkout: repo\r\n dffml_operations_innersource.operations:contributing_present:\r\n inputs:\r\n repo:\r\n - git_repo_checkout: repo\r\n dffml_operations_innersource.operations:get_current_datetime_as_git_date:\r\n inputs: {}\r\n dffml_operations_innersource.operations:github_workflows:\r\n inputs:\r\n repo:\r\n - git_repo_checkout: repo\r\n dffml_operations_innersource.operations:groovy_files:\r\n inputs:\r\n repo:\r\n - git_repo_checkout: repo\r\n dffml_operations_innersource.operations:jenkinsfiles:\r\n inputs:\r\n repo:\r\n - git_repo_checkout: repo\r\n dffml_operations_innersource.operations:maintained:\r\n inputs:\r\n results:\r\n - group_by: output\r\n dffml_operations_innersource.operations:readme_present:\r\n inputs:\r\n repo:\r\n - git_repo_checkout: repo\r\n dffml_operations_innersource.operations:security_present:\r\n inputs:\r\n repo:\r\n - git_repo_checkout: repo\r\n dffml_operations_innersource.operations:support_present:\r\n inputs:\r\n repo:\r\n - git_repo_checkout: repo\r\n dffml_operations_innersource.operations:unmaintained:\r\n inputs:\r\n results:\r\n - group_by: output\r\n git_commits:\r\n inputs:\r\n branch:\r\n - git_repo_default_branch: branch\r\n repo:\r\n - clone_git_repo: repo\r\n start_end:\r\n - 
quarters_back_to_date: start_end\r\n git_repo_author_lines_for_dates:\r\n inputs:\r\n branch:\r\n - git_repo_default_branch: branch\r\n repo:\r\n - clone_git_repo: repo\r\n start_end:\r\n - quarters_back_to_date: start_end\r\n git_repo_checkout:\r\n inputs:\r\n commit:\r\n - git_repo_commit_from_date: commit\r\n repo:\r\n - clone_git_repo: repo\r\n git_repo_commit_from_date:\r\n inputs:\r\n branch:\r\n - git_repo_default_branch: branch\r\n date:\r\n - quarters_back_to_date: date\r\n repo:\r\n - clone_git_repo: repo\r\n git_repo_default_branch:\r\n conditions:\r\n - seed\r\n inputs:\r\n repo:\r\n - clone_git_repo: repo\r\n git_repo_release:\r\n inputs:\r\n branch:\r\n - git_repo_default_branch: branch\r\n repo:\r\n - clone_git_repo: repo\r\n start_end:\r\n - quarters_back_to_date: start_end\r\n group_by:\r\n inputs:\r\n spec:\r\n - seed\r\n lines_of_code_by_language:\r\n conditions:\r\n - dffml_operations_innersource.operations:badge_maintained: result\r\n - dffml_operations_innersource.operations:badge_unmaintained: result\r\n - dffml_operations_innersource.cli:ensure_tokei: result\r\n inputs:\r\n repo:\r\n - git_repo_checkout: repo\r\n lines_of_code_to_comments:\r\n inputs:\r\n langs:\r\n - lines_of_code_by_language: lines_by_language\r\n make_quarters:\r\n inputs:\r\n number:\r\n - seed\r\n quarters_back_to_date:\r\n inputs:\r\n date:\r\n - dffml_operations_innersource.operations:get_current_datetime_as_git_date: result\r\n number:\r\n - make_quarters: quarters\r\n work:\r\n inputs:\r\n author_lines:\r\n - git_repo_author_lines_for_dates: author_lines\r\nlinked: true\r\noperations:\r\n alice.shouldi.contribute.cicd:cicd_action_library:\r\n inputs:\r\n action_file_paths: ActionYAMLFileWorkflowUnixStylePath\r\n name: alice.shouldi.contribute.cicd:cicd_action_library\r\n outputs:\r\n result: IsCICDGitHubActionsLibrary\r\n retry: 0\r\n stage: output\r\n alice.shouldi.contribute.cicd:cicd_jenkins_library:\r\n inputs:\r\n groovy_file_paths: GroovyFileWorkflowUnixStylePath\r\n name: alice.shouldi.contribute.cicd:cicd_jenkins_library\r\n outputs:\r\n result: IsCICDJenkinsLibrary\r\n retry: 0\r\n stage: output\r\n alice.shouldi.contribute.cicd:cicd_library:\r\n inputs:\r\n cicd_action_library: IsCICDGitHubActionsLibrary\r\n cicd_jenkins_library: IsCICDJenkinsLibrary\r\n name: alice.shouldi.contribute.cicd:cicd_library\r\n outputs:\r\n result: CICDLibrary\r\n retry: 0\r\n stage: output\r\n check_if_valid_git_repository_URL:\r\n inputs:\r\n URL: URL\r\n name: check_if_valid_git_repository_URL\r\n outputs:\r\n valid: valid_git_repository_URL\r\n retry: 0\r\n stage: processing\r\n cleanup_git_repo:\r\n inputs:\r\n repo: git_repository\r\n name: cleanup_git_repo\r\n outputs: {}\r\n retry: 0\r\n stage: cleanup\r\n clone_git_repo:\r\n conditions:\r\n - valid_git_repository_URL\r\n inputs:\r\n URL: URL\r\n ssh_key: git_repo_ssh_key\r\n name: clone_git_repo\r\n outputs:\r\n repo: git_repository\r\n retry: 0\r\n stage: processing\r\n count_authors:\r\n inputs:\r\n author_lines: author_line_count\r\n name: count_authors\r\n outputs:\r\n authors: author_count\r\n retry: 0\r\n stage: processing\r\n dffml_feature_git.feature.operations:git_grep:\r\n inputs:\r\n repo: git_repository\r\n search: git_grep_search\r\n name: dffml_feature_git.feature.operations:git_grep\r\n outputs:\r\n found: git_grep_found\r\n retry: 0\r\n stage: processing\r\n dffml_operations_innersource.cli:ensure_tokei:\r\n inputs: {}\r\n name: dffml_operations_innersource.cli:ensure_tokei\r\n outputs:\r\n result: str\r\n retry: 0\r\n stage: 
processing\r\n dffml_operations_innersource.cli:github_repo_id_to_clone_url:\r\n inputs:\r\n repo_id: GitHubRepoID\r\n name: dffml_operations_innersource.cli:github_repo_id_to_clone_url\r\n outputs:\r\n result: URL\r\n retry: 0\r\n stage: processing\r\n dffml_operations_innersource.operations:action_yml_files:\r\n expand:\r\n - result\r\n inputs:\r\n repo: git_repository_checked_out\r\n name: dffml_operations_innersource.operations:action_yml_files\r\n outputs:\r\n result: ActionYAMLFileWorkflowUnixStylePath\r\n retry: 0\r\n stage: processing\r\n dffml_operations_innersource.operations:badge_maintained:\r\n conditions:\r\n - bool\r\n inputs: {}\r\n name: dffml_operations_innersource.operations:badge_maintained\r\n outputs:\r\n result: str\r\n retry: 0\r\n stage: output\r\n dffml_operations_innersource.operations:badge_unmaintained:\r\n conditions:\r\n - bool\r\n inputs: {}\r\n name: dffml_operations_innersource.operations:badge_unmaintained\r\n outputs:\r\n result: str\r\n retry: 0\r\n stage: output\r\n dffml_operations_innersource.operations:code_of_conduct_present:\r\n inputs:\r\n repo: git_repository_checked_out\r\n name: dffml_operations_innersource.operations:code_of_conduct_present\r\n outputs:\r\n result: FileCodeOfConductPresent\r\n retry: 0\r\n stage: processing\r\n dffml_operations_innersource.operations:contributing_present:\r\n inputs:\r\n repo: git_repository_checked_out\r\n name: dffml_operations_innersource.operations:contributing_present\r\n outputs:\r\n result: FileContributingPresent\r\n retry: 0\r\n stage: processing\r\n dffml_operations_innersource.operations:get_current_datetime_as_git_date:\r\n inputs: {}\r\n name: dffml_operations_innersource.operations:get_current_datetime_as_git_date\r\n outputs:\r\n result: quarter_start_date\r\n retry: 0\r\n stage: processing\r\n dffml_operations_innersource.operations:github_workflows:\r\n expand:\r\n - result\r\n inputs:\r\n repo: git_repository_checked_out\r\n name: dffml_operations_innersource.operations:github_workflows\r\n outputs:\r\n result: GitHubActionsWorkflowUnixStylePath\r\n retry: 0\r\n stage: processing\r\n dffml_operations_innersource.operations:groovy_files:\r\n expand:\r\n - result\r\n inputs:\r\n repo: git_repository_checked_out\r\n name: dffml_operations_innersource.operations:groovy_files\r\n outputs:\r\n result: GroovyFileWorkflowUnixStylePath\r\n retry: 0\r\n stage: processing\r\n dffml_operations_innersource.operations:jenkinsfiles:\r\n expand:\r\n - result\r\n inputs:\r\n repo: git_repository_checked_out\r\n name: dffml_operations_innersource.operations:jenkinsfiles\r\n outputs:\r\n result: JenkinsfileWorkflowUnixStylePath\r\n retry: 0\r\n stage: processing\r\n dffml_operations_innersource.operations:maintained:\r\n inputs:\r\n results: group_by_output\r\n name: dffml_operations_innersource.operations:maintained\r\n outputs:\r\n result: bool\r\n retry: 0\r\n stage: output\r\n dffml_operations_innersource.operations:readme_present:\r\n inputs:\r\n repo: git_repository_checked_out\r\n name: dffml_operations_innersource.operations:readme_present\r\n outputs:\r\n result: FileReadmePresent\r\n retry: 0\r\n stage: processing\r\n dffml_operations_innersource.operations:security_present:\r\n inputs:\r\n repo: git_repository_checked_out\r\n name: dffml_operations_innersource.operations:security_present\r\n outputs:\r\n result: FileSecurityPresent\r\n retry: 0\r\n stage: processing\r\n dffml_operations_innersource.operations:support_present:\r\n inputs:\r\n repo: git_repository_checked_out\r\n name: 
dffml_operations_innersource.operations:support_present\r\n outputs:\r\n result: FileSupportPresent\r\n retry: 0\r\n stage: processing\r\n dffml_operations_innersource.operations:unmaintained:\r\n inputs:\r\n results: group_by_output\r\n name: dffml_operations_innersource.operations:unmaintained\r\n outputs:\r\n result: bool\r\n retry: 0\r\n stage: output\r\n git_commits:\r\n inputs:\r\n branch: git_branch\r\n repo: git_repository\r\n start_end: date_pair\r\n name: git_commits\r\n outputs:\r\n commits: commit_count\r\n retry: 0\r\n stage: processing\r\n git_repo_author_lines_for_dates:\r\n inputs:\r\n branch: git_branch\r\n repo: git_repository\r\n start_end: date_pair\r\n name: git_repo_author_lines_for_dates\r\n outputs:\r\n author_lines: author_line_count\r\n retry: 0\r\n stage: processing\r\n git_repo_checkout:\r\n inputs:\r\n commit: git_commit\r\n repo: git_repository\r\n name: git_repo_checkout\r\n outputs:\r\n repo: git_repository_checked_out\r\n retry: 0\r\n stage: processing\r\n git_repo_commit_from_date:\r\n inputs:\r\n branch: git_branch\r\n date: date\r\n repo: git_repository\r\n name: git_repo_commit_from_date\r\n outputs:\r\n commit: git_commit\r\n retry: 0\r\n stage: processing\r\n git_repo_default_branch:\r\n conditions:\r\n - no_git_branch_given\r\n inputs:\r\n repo: git_repository\r\n name: git_repo_default_branch\r\n outputs:\r\n branch: git_branch\r\n remote: git_remote\r\n retry: 0\r\n stage: processing\r\n git_repo_release:\r\n inputs:\r\n branch: git_branch\r\n repo: git_repository\r\n start_end: date_pair\r\n name: git_repo_release\r\n outputs:\r\n present: release_within_period\r\n retry: 0\r\n stage: processing\r\n group_by:\r\n inputs:\r\n spec: group_by_spec\r\n name: group_by\r\n outputs:\r\n output: group_by_output\r\n retry: 0\r\n stage: output\r\n lines_of_code_by_language:\r\n conditions:\r\n - str\r\n inputs:\r\n repo: git_repository_checked_out\r\n name: lines_of_code_by_language\r\n outputs:\r\n lines_by_language: lines_by_language_count\r\n retry: 0\r\n stage: processing\r\n lines_of_code_to_comments:\r\n inputs:\r\n langs: lines_by_language_count\r\n name: lines_of_code_to_comments\r\n outputs:\r\n code_to_comment_ratio: language_to_comment_ratio\r\n retry: 0\r\n stage: processing\r\n make_quarters:\r\n expand:\r\n - quarters\r\n inputs:\r\n number: quarters\r\n name: make_quarters\r\n outputs:\r\n quarters: quarter\r\n retry: 0\r\n stage: processing\r\n quarters_back_to_date:\r\n expand:\r\n - date\r\n - start_end\r\n inputs:\r\n date: quarter_start_date\r\n number: quarter\r\n name: quarters_back_to_date\r\n outputs:\r\n date: date\r\n start_end: date_pair\r\n retry: 0\r\n stage: processing\r\n work:\r\n inputs:\r\n author_lines: author_line_count\r\n name: work\r\n outputs:\r\n work: work_spread\r\n retry: 0\r\n stage: processing\r\nseed:\r\n- definition: quarters\r\n origin: seed\r\n value: 10\r\n- definition: no_git_branch_given\r\n origin: seed\r\n value: true\r\n- definition: group_by_spec\r\n origin: seed\r\n value:\r\n ActionYAMLFileWorkflowUnixStylePath:\r\n by: quarter\r\n group: ActionYAMLFileWorkflowUnixStylePath\r\n nostrict: true\r\n FileCodeOfConductPresent:\r\n by: quarter\r\n group: FileCodeOfConductPresent\r\n nostrict: true\r\n FileContributingPresent:\r\n by: quarter\r\n group: FileContributingPresent\r\n nostrict: true\r\n FileReadmePresent:\r\n by: quarter\r\n group: FileReadmePresent\r\n nostrict: true\r\n FileSecurityPresent:\r\n by: quarter\r\n group: FileSecurityPresent\r\n nostrict: true\r\n FileSupportPresent:\r\n by: 
quarter\r\n group: FileSupportPresent\r\n nostrict: true\r\n GitHubActionsWorkflowUnixStylePath:\r\n by: quarter\r\n group: GitHubActionsWorkflowUnixStylePath\r\n nostrict: true\r\n GroovyFileWorkflowUnixStylePath:\r\n by: quarter\r\n group: GroovyFileWorkflowUnixStylePath\r\n nostrict: true\r\n JenkinsfileWorkflowUnixStylePath:\r\n by: quarter\r\n group: JenkinsfileWorkflowUnixStylePath\r\n nostrict: true\r\n author_line_count:\r\n by: quarter\r\n group: author_line_count\r\n nostrict: true\r\n commit_shas:\r\n by: quarter\r\n group: git_commit\r\n nostrict: true\r\n release_within_period:\r\n by: quarter\r\n group: release_within_period\r\n nostrict: true\r\n\r\n```\r\n\r\n</details>"
}
]
},
{
"body": "# 2022-11-15 Engineering Logs\r\n\r\n- Exemplary docs\r\n - https://cve-bin-tool.readthedocs.io/en/latest/CONTRIBUTING.html#running-tests",
"replies": [
{
"body": " ## 2022-11-15 @pdxjohnny Engineering Logs\r\n\r\n- https://docs.joinmastodon.org/spec/activitypub/\r\n- https://docs.joinmastodon.org/dev/setup/\r\n - > In the development environment, Mastodon will use PostgreSQL as the currently signed-in Linux user using the `ident` method, which usually works out of the box. The one command you need to run is rails `db:setup` which will create the databases `mastodon_development` and `mastodon_test`, load the schema into them, and then create seed data defined in `db/seed.rb` in `mastodon_development`. The only seed data is an admin account with the credentials `admin@localhost:3000` / `mastodonadmin`.\r\n - We'll change the `.env.production` user to match\r\n- https://github.com/felx/mastodon-documentation/blob/master/Running-Mastodon/Docker-Guide.md\r\n\r\n**.env.production**\r\n\r\n```bash\r\n# Generated with mastodon:setup on 2022-11-15 14:37:27 UTC\r\n\r\n# Some variables in this file will be interpreted differently whether you are\r\n# using docker-compose or not.\r\n\r\nLOCAL_DOMAIN=localhost\r\nSINGLE_USER_MODE=false\r\nSECRET_KEY_BASE=1c60ddccf21afd66e355a85621767feb1ffe47d1b9ac9e8bab5ef283a0fa6c1cc9e7015409bb645551ef7ab4b9f09aed90069640e91500f0009887509d2e1f4f\r\nOTP_SECRET=376e8655790cc05d973d6d427e1e37f98cee9ebc91f6c33eda6243b650fd8f8531a34a43d4c0d62940db6064ea8bdce581d11ff7a22e4ec81f7ffedaad0ad26f\r\nVAPID_PRIVATE_KEY=M7FtL40N4rJ2BtbtyWFHN9b1jaWD4x8p2Pab-FGGb3M=\r\nVAPID_PUBLIC_KEY=BP_BPQEpiSuv0Qri0XWSr54MC0ug5hHb905PPRLufPhu13QCF3D86cW3ReFnZ411VoDB5lDfuntBmYU0Ku65oVs=\r\nDB_HOST=db\r\nDB_PORT=5432\r\nDB_NAME=mastodon_development\r\nDB_USER=admin\r\nDB_PASS=mastodonadmin\r\nREDIS_HOST=redis\r\nREDIS_PORT=6379\r\nREDIS_PASSWORD=\r\nSMTP_SERVER=localhost\r\nSMTP_PORT=25\r\nSMTP_AUTH_METHOD=none\r\nSMTP_OPENSSL_VERIFY_MODE=none\r\nSMTP_ENABLE_STARTTLS=auto\r\nSMTP_FROM_ADDRESS=Mastodon <notifications@localhost>\r\n```\r\n\r\n```console\r\n$ grep POSTGRES_ docker-compose.yml\r\n - 'POSTGRES_DB=mastodon_development'\r\n - 'POSTGRES_USER=admin'\r\n - 'POSTGRES_PASSWORD=mastodonadmin'\r\n$ time podman-compose run -e DISABLE_DATABASE_ENVIRONMENT_CHECK=1 web rails db:setup\r\n... 
<copy paste config into .env.production> ...\r\n$ time podman-compose run web bundle exec rake db:migrate\r\n$ podman-compose up\r\n$ curl -H \"Host: https://localhost:3000/\" -v http://localhost:3000/\r\n* Trying 127.0.0.1:3000...\r\n* Connected to localhost (127.0.0.1) port 3000 (#0)\r\n> GET / HTTP/1.1\r\n> Host: https://localhost:3000/\r\n> User-Agent: curl/7.85.0\r\n> Accept: */*\r\n> \r\n* Mark bundle as not supporting multiuse\r\n< HTTP/1.1 403 Forbidden\r\n< Content-Type: text/html; charset=UTF-8\r\n< Content-Length: 0\r\n< \r\n* Connection #0 to host localhost left intact\r\n```\r\n\r\n```console\r\n$ podman-compose run web -e RAILS_ENV=production bin/tootctl accounts modify alice --role Owner\r\n$ podman-compose run web -e RAILS_ENV=production bin/tootctl accounts create \\\r\n alice \\\r\n --email alice@chadig.com \\\r\n --confirmed \\\r\n --role Owner\r\n```\r\n\r\n- Okay, giving up on the Mastodon spin up; RSS feeds (+websub) are probably best for SBOM and VEX\r\n streams anyway.\r\n- References\r\n - https://github.com/BasixKOR/awesome-activitypub\r\n - https://github.com/dariusk/rss-to-activitypub\r\n - https://www.w3schools.com/xml/xml_rss.asp\r\n - https://github.com/chainfeeds/RSSAggregatorforWeb3\r\n - Here's a possible basis for our web2 -> web3/5\r\n - https://github.com/RoCry/feedcooker/releases/tag/latest\r\n - https://github.com/RoCry/feedcooker/releases/download/latest/Rust_News.xml\r\n - https://github.com/RoCry/feedcooker/issues/1\r\n - This is a nice aggregator we could use in the future\r\n - https://github.com/actionsflow/actionsflow-workflow-default\r\n - GitHub Actions workflows can trigger from RSS feeds via this third party framework,\r\n not clear if it polls or not. Websub and publish / serialize / configloader for\r\n `dffml dataflow run records set` output as RSS feed?\r\n - https://actionsflow.github.io/\r\n - https://mastodon.social/@pdxjohnny.rss\r\n - Example posted below\r\n - https://twit.social/@jr/109348004478960008\r\n - https://twit.social/tags/android.rss\r\n - Very cool: Mastodon will serve RSS feeds for tags.\r\n - This would allow us to reply to tweets with given tags\r\n and then automatically determine provenance (see deepfake detection),\r\n and reply with estimated provenance via SBOM / VEX with SCITT\r\n receipts encoded into a (didme.me) image in response (or if\r\n we can put the CBOR in a JWK claim maybe that would serialize\r\n to a stupidly long string, then encode that to an image?)\r\n - https://mastodon.social/tags/scitt.rss\r\n - It would be nice if there was a multi-tag URL.\r\n - Example: https://mastodon.social/tags/alice,scitt,vex.rss\r\n - Example: https://mastodon.social/tags/scitt,vex.rss\r\n - Example: https://mastodon.social/tags/scitt,sbom.rss\r\n\r\n```xml\r\n<?xml version=\"1.0\" encoding=\"UTF-8\"?>\r\n<rss version=\"2.0\" xmlns:webfeeds=\"http://webfeeds.org/rss/1.0\" xmlns:media=\"http://search.yahoo.com/mrss/\">\r\n <channel>\r\n <title>John</title>\r\n <description>Public posts from @pdxjohnny@mastodon.social</description>\r\n <link>https://mastodon.social/@pdxjohnny</link>\r\n <image>\r\n <url>https://files.mastodon.social/accounts/avatars/000/032/591/original/9c6c698d572049b4.jpeg</url>\r\n <title>John</title>\r\n <link>https://mastodon.social/@pdxjohnny</link>\r\n </image>\r\n <lastBuildDate>Tue, 15 Nov 2022 16:18:15 +0000</lastBuildDate>\r\n <webfeeds:icon>https://files.mastodon.social/accounts/avatars/000/032/591/original/9c6c698d572049b4.jpeg</webfeeds:icon>\r\n <generator>Mastodon v4.0.2</generator>\r\n <item>\r\n 
<guid isPermaLink=\"true\">https://mastodon.social/@pdxjohnny/109348722777644811</guid>\r\n <link>https://mastodon.social/@pdxjohnny/109348722777644811</link>\r\n <pubDate>Tue, 15 Nov 2022 16:18:15 +0000</pubDate>\r\n <description>&lt;p&gt;RSS VEX feeds?&lt;/p&gt;&lt;p&gt;&lt;a href=\"https://twit.social/@jr/109345573865828477\" target=\"_blank\" rel=\"nofollow noopener noreferrer\"&gt;&lt;span class=\"invisible\"&gt;https://&lt;/span&gt;&lt;span class=\"ellipsis\"&gt;twit.social/@jr/10934557386582&lt;/span&gt;&lt;span class=\"invisible\"&gt;8477&lt;/span&gt;&lt;/a&gt;&lt;/p&gt;&lt;p&gt;2022-11-15 Engineering Logs: &lt;a href=\"https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4146655\" target=\"_blank\" rel=\"nofollow noopener noreferrer\"&gt;&lt;span class=\"invisible\"&gt;https://&lt;/span&gt;&lt;span class=\"ellipsis\"&gt;github.com/intel/dffml/discuss&lt;/span&gt;&lt;span class=\"invisible\"&gt;ions/1406?sort=new#discussioncomment-4146655&lt;/span&gt;&lt;/a&gt;&lt;/p&gt;</description>\r\n </item>\r\n <item>\r\n <guid isPermaLink=\"true\">https://mastodon.social/@pdxjohnny/109320563491316354</guid>\r\n <link>https://mastodon.social/@pdxjohnny/109320563491316354</link>\r\n <pubDate>Thu, 10 Nov 2022 16:56:58 +0000</pubDate>\r\n <description>&lt;p&gt;The Alice thread continues!&lt;/p&gt;&lt;p&gt;We take one step further towards decentralization as we federate our way away from Twitter.&lt;/p&gt;&lt;p&gt;Today we&amp;#39;re playing with SCITT and ATProto: &lt;a href=\"https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4104302\" target=\"_blank\" rel=\"nofollow noopener noreferrer\"&gt;&lt;span class=\"invisible\"&gt;https://&lt;/span&gt;&lt;span class=\"ellipsis\"&gt;github.com/intel/dffml/discuss&lt;/span&gt;&lt;span class=\"invisible\"&gt;ions/1406?sort=new#discussioncomment-4104302&lt;/span&gt;&lt;/a&gt;&lt;/p&gt;&lt;p&gt;Prev: &lt;a href=\"https://twitter.com/pdxjohnny/status/1585488415864557568\" target=\"_blank\" rel=\"nofollow noopener noreferrer\"&gt;&lt;span class=\"invisible\"&gt;https://&lt;/span&gt;&lt;span class=\"ellipsis\"&gt;twitter.com/pdxjohnny/status/1&lt;/span&gt;&lt;span class=\"invisible\"&gt;585488415864557568&lt;/span&gt;&lt;/a&gt;&lt;/p&gt;</description>\r\n </item>\r\n </channel>\r\n</rss>\r\n```\r\n\r\n- We could also httptest the NIST API\r\n - https://github.com/intel/cve-bin-tool/issues/2334\r\n - Looks like 7 days ago the cve-bin-tool community themselves (Terri in this case :) highlighted a similar need!\r\n - Trying to run tests\r\n - Need `NVD_API_KEY`\r\n - Request via email activation flow https://nvd.nist.gov/developers/request-an-api-key\r\n - Link in email to activation page (10 minute email websub rss? 
-> ATP)\r\n - Grab UUID which is token off page\r\n\r\n```console\r\n$ nvd_api_key=$NVD_API_KEY LONG_TESTS=1 python -um pytest -v --log-level=DEBUG --log-cli-level=DEBUG test/test_nvd_api.py 2>&1 | gh gist create -p -d 'Failure to launch NVD API tests: https://github.com/intel/cve-bin-tool/issues/2334'\r\n```\r\n\r\n- Output of above command: https://gist.github.com/pdxjohnny/dcfaecadd743e773c8aed3e1d323e0bd\r\n - `$ REC_TITLE=\"httptest NIST API: 2022-11-15 @pdxjohnny Engineering Logs: https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4146655\" exec bash`\r\n - https://github.com/pdxjohnny/dotfiles/blob/ccccfe8f55729bab6f00573a0b3c0358a3a77cf9/.asciinema_source\r\n - `$ unxz -d < ~/asciinema/fedora-rec-2022-11-15T10:05:02-08:00.json.xz | python -m asciinema upload /dev/stdin`\r\n - `$ unxz -d < $(ls ~/asciinema/$(hostname)-rec-* | tail -n 1) | python -m asciinema upload /dev/stdin`\r\n\r\n[![asciicast-of-failure-to-run-test_nvd_api](https://asciinema.org/a/537871.svg)](https://asciinema.org/a/537871)\r\n\r\n[![asciicast](https://asciinema.org/a/537888.svg)](https://asciinema.org/a/537888)\r\n\r\n- Got the NVD tests parameterized to versions 1 and 2.\r\n\r\n```diff\r\ndiff --git a/cve_bin_tool/nvd_api.py b/cve_bin_tool/nvd_api.py\r\nindex 6245c56..d151cd1 100644\r\n--- a/cve_bin_tool/nvd_api.py\r\n+++ b/cve_bin_tool/nvd_api.py\r\n@@ -139,7 +139,7 @@ class NVD_API:\r\n \r\n if self.invalid_api:\r\n self.logger.warning(\r\n- f'Unable to access NVD using provided API key: {self.params[\"apiKey\"]}'\r\n+ f'Unable to access NVD using provided API key: {self.params.get(\"apiKey\", \"NO_API_KEY_GIVEN\")}'\r\n )\r\n else:\r\n if time_of_last_update:\r\ndiff --git a/test/test_nvd_api.py b/test/test_nvd_api.py\r\nindex 29f14e9..109815c 100644\r\n--- a/test/test_nvd_api.py\r\n+++ b/test/test_nvd_api.py\r\n@@ -8,6 +8,7 @@ from datetime import datetime, timedelta\r\n from test.utils import LONG_TESTS\r\n \r\n import pytest\r\n+import aiohttp\r\n \r\n from cve_bin_tool.cvedb import CVEDB\r\n from cve_bin_tool.data_sources import nvd_source\r\n@@ -42,14 +43,24 @@ class TestNVD_API:\r\n LONG_TESTS() != 1 or not os.getenv(\"nvd_api_key\"),\r\n reason=\"NVD tests run only in long tests\",\r\n )\r\n- async def test_total_results_count(self):\r\n+ @pytest.mark.parametrize(\r\n+ \"api_version, feed\",\r\n+ [\r\n+ (\"1.0\", None),\r\n+ (\"2.0\", None),\r\n+ ],\r\n+ )\r\n+ async def test_total_results_count(self, api_version, feed):\r\n \"\"\"Total results should be greater than or equal to the current fetched cves\"\"\"\r\n- nvd_api = NVD_API(api_key=os.getenv(\"nvd_api_key\") or \"\")\r\n- await nvd_api.get_nvd_params(\r\n- time_of_last_update=datetime.now() - timedelta(days=2)\r\n- )\r\n- await nvd_api.get()\r\n- assert len(nvd_api.all_cve_entries) >= nvd_api.total_results\r\n+ async with aiohttp.ClientSession() as session:\r\n+ nvd_api = NVD_API(api_key=os.getenv(\"nvd_api_key\") or \"\",\r\n+ session=session)\r\n+ nvd_api.logger.info(\"api_version: %s, feed: %s\", api_version, feed)\r\n+ await nvd_api.get_nvd_params(\r\n+ time_of_last_update=datetime.now() - timedelta(days=2)\r\n+ )\r\n+ await nvd_api.get()\r\n+ assert len(nvd_api.all_cve_entries) >= nvd_api.total_results\r\n \r\n @pytest.mark.asyncio\r\n 
@pytest.mark.skipif(\r\n```\r\n\r\n[![asciicast](https://asciinema.org/a/537921.svg)](https://asciinema.org/a/537921)\r\n\r\n[![asciicast](https://asciinema.org/a/537925.svg)](https://asciinema.org/a/537925)\r\n\r\n[![asciicast-stash-p](https://asciinema.org/a/537931.svg)](https://asciinema.org/a/537931)\r\n\r\n- Reverse engineering the NIST API by dumping request/response pairs\r\n\r\n[![asciicast](https://asciinema.org/a/537936.svg)](https://asciinema.org/a/537936)\r\n\r\n```console\r\n$ gh gist create -p -d 'intel/cve-bin-tool: tests: add tests for NVD 2.0 API: https://github.com/intel/cve-bin-tool/issues/2334#issuecomment-1315643093: https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4146655' /tmp/feed-f232077c4b0644a8f77acb0c63c3d30bb59eff3be774e3e37d00c7b15cfe95079d8d80b48fede725a2f0f19cba0c9496-params.json /tmp/feed-f232077c4b0644a8f77acb0c63c3d30bb59eff3be774e3e37d00c7b15cfe95079d8d80b48fede725a2f0f19cba0c9496.json /tmp/stats.json /tmp/feed-e459d6f8805bad4c8f3097dd5071732478d08e2a6ad50c734199bc24983f49c2d1567ea11bbf2993de662af4736113c4-params.json /tmp/feed-e459d6f8805bad4c8f3097dd5071732478d08e2a6ad50c734199bc24983f49c2d1567ea11bbf2993de662af4736113c4.json /tmp/validate-283492d554c095740c199f739dd4944bfab86a6db800993e16494209c1420061fe7c0e174570715ff7bd9132d26e9b47*\r\n```\r\n\r\n- Dumped request/response format: https://gist.github.com/pdxjohnny/599b453dffc799f1c4dd8d8024b0f60e\r\n- Started on https://github.com/pdxjohnny/httptest server\r\n\r\n[![asciicast](https://asciinema.org/a/537938.svg)](https://asciinema.org/a/537938)\r\n\r\n- TODO\r\n - [ ] ~~Spin up Mastodon~~\r\n - [ ] Investigate https://docs.joinmastodon.org/spec/webfinger/#example\r\n - [ ] NIST vuln feed as VEX/VDR API via httptest, then integrate as an additional vuln feed to cve-bin-tool, then publish via another project (pytss), then to RSS, then rss-to-activitypub, and then see if that integrates with Mastodon, then RSS to web3/5\r\n - If we can get something federated working then Alice can send SBOM and VEX updates\r\n - https://github.com/intel/cve-bin-tool/pull/1698\r\n- Future\r\n - [ ] Reuse ephemeral ssh server spun up across data flows running on different hosts\r\n - [ ] Document asciicast-stash-p https://asciinema.org/a/537931 as refactoring method\r\n - [ ] Multi context logging (multiple Sources? in output query / method / data flow as class?)\r\n - Examples\r\n - Posting updates on status of CVE Bin Tool VEX via NVD API style feed\r\n as well as https://github.com/intel/cve-bin-tool/issues/2334#issuecomment-1315643093
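\r\n- Sketch of the cached-response replay approach behind the NVD style server work above. A minimal sketch, assuming only the Python standard library; the cache filename and port are hypothetical stand-ins, and this is not the httptest implementation itself:\r\n\r\n```python\r\n# Rough sketch: serve a previously dumped NVD style JSON response verbatim.\r\n# Not the httptest implementation; cache path and port are hypothetical.\r\nimport pathlib\r\nfrom http.server import BaseHTTPRequestHandler, HTTPServer\r\n\r\nCACHE = pathlib.Path('feed-cache.json')  # a dumped response body\r\n\r\nclass NVDStyleHandler(BaseHTTPRequestHandler):\r\n    def do_GET(self):\r\n        # Replay the cached bytes for any GET, mirroring the dumped response\r\n        body = CACHE.read_bytes()\r\n        self.send_response(200)\r\n        self.send_header('Content-Type', 'application/json')\r\n        self.send_header('Content-Length', str(len(body)))\r\n        self.end_headers()\r\n        self.wfile.write(body)\r\n\r\nif __name__ == '__main__':\r\n    HTTPServer(('127.0.0.1', 8000), NVDStyleHandler).serve_forever()\r\n```"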
}
]
},
{
"body": "# 2022-11-16 Engineering Logs",
"replies": [
{
"body": "## 2022-11-16 @pdxjohnny Engineering Logs\r\n\r\n- NVD API style as first way to distribute VEX.\r\n - ActivityPub publish as well\r\n - Websub for new notifications? Look up how Mastodon does.\r\n- Working on cve-bin-tool https://github.com/intel/cve-bin-tool/issues/2334#issuecomment-1315643093\r\n - We're reverse engineering the NIST NVD API to serve VEX.\r\n - The following logs/recordings can be useful in learning how to reverse\r\n engineer an HTTP based protocol to implement a similar server.\r\n - This becomes the base layer for communication in our decentralized CI/CD\r\n aka DFFML plugin land, aka poly repo land, aka the real world, aka Wonderland.\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#what-is-alice\r\n - [service: sw: src: change: notify: Service to facilitate poly repo pull model dev tooling #1315](https://github.com/intel/dffml/issues/1315#issuecomment-1066814280)\r\n - Vuln management is a MUST implement channel we can use for patch submission\r\n and comms for alignment between entities.\r\n - We're hitting this open issue while were at it.\r\n- Got basic stats response saved from cache working\r\n - Cache: https://gist.github.com/pdxjohnny/599b453dffc799f1c4dd8d8024b0f60e\r\n - Got serving feed working with same page requested over fails test (as it should, paging broken currently, next is fix that).\r\n - [gist: Python example pagination client and server](https://gist.github.com/pdxjohnny/47a6ddcd122a8f693ef346153708525a)\r\n- Side note: This asciinema was 12 MB uncut so I had to trim it up a bit\r\n\r\n[![asciicast](https://asciinema.org/a/538130.svg)](https://asciinema.org/a/538130)\r\n\r\n- httptest NIST API single CVE import working\r\n\r\n[![asciicast](https://asciinema.org/a/538136.svg)](https://asciinema.org/a/538136)\r\n\r\n[![asciicast](https://asciinema.org/a/538143.svg)](https://asciinema.org/a/538143)\r\n\r\n- Pagnation asciicast (too big, 12 MB decompressed)\r\n - [nvd-pagenation.json.txt](https://github.com/intel/dffml/files/10023980/nvd-pagenation.json.txt)\r\n\r\n```console\r\n$ unxz -d < $(ls ~/asciinema/fedora-rec-* | tail -n 1) | dd if=/dev/stdin of=/dev/null status=progress\r\n24117+1 records in\r\n24117+1 records out\r\n12348069 bytes (12 MB, 12 MiB) copied, 0.0500872 s, 247 MB/s\r\n```\r\n\r\n- Basic server seems to be working for v1 API\r\n- Added CLI command `alice threats vulns serve nvdstyle`\r\n - https://github.com/intel/dffml/commit/cb2c09ead795ba0046cb5911bcd6e939419058d8\r\n\r\nhttps://github.com/intel/dffml/blob/4101595a800e74f57cec5537ea2c65680135b71a/entities/alice/alice/threats/vulns/serve/nvdstyle.py#L1-L241\r\n\r\n- https://www.darkreading.com/dr-tech/cybersecurity-nutrition-labels-still-a-work-in-progress\r\n - https://www.whitehouse.gov/briefing-room/statements-releases/2022/10/20/statement-by-nsc-spokesperson-adrienne-watson-on-the-biden-harris-administrations-effort-to-secure-household-internet-enabled-devices/\r\n - > Yesterday, the White House convened leaders from the private sector, academic institutions, and the U.S. Government to advance a national cybersecurity labeling program for Internet-of-Things (IoT) devices. The Biden-Harris Administration has made it a priority to strengthen our nation\u2019s cybersecurity, and a key part of that effort is ensuring the devices that have become a commonplace in the average American household \u2013 like baby monitors or smart home appliances \u2013 are protected from cyber threats. 
A labeling program to secure such devices would provide American consumers with the peace of mind that the technology being brought into their homes is safe, and incentivize manufacturers to meet higher cybersecurity standards and retailers to market secure devices.\r\n >\r\n > Yesterday\u2019s dialogue focused on how to best implement a national cybersecurity labeling program, drive improved security standards for Internet-enabled devices, and generate a globally recognized label. Government and industry leaders discussed the importance of a trusted program to increase security across consumer devices that connect to the Internet by equipping devices with easily recognized labels to help consumers make more informed cybersecurity choices (e.g., an \u201cEnergyStar\u201d for cyber). These conversations build on the foundational work that has been pioneered by the private sector and the National Institute of Standards and Technology (NIST) to help build more secure Internet-connected devices. It also follows President Biden\u2019s Executive Order on Improving the Nation\u2019s Cybersecurity, which highlighted the need for improved IoT security and tasked NIST, in partnership with the Federal Trade Commission, to advance improved cybersecurity standards and standardized product labels for these devices.\r\n - Related: `$ grep DNA`\r\n- https://csrc.nist.gov/publications/detail/white-paper/2022/11/09/implementing-a-risk-based-approach-to-devsecops/final\r\n - > DevOps brings together software development and operations to shorten development cycles, allow organizations to be agile, and maintain the pace of innovation while taking advantage of cloud-native technology and practices. Industry and government have fully embraced and are rapidly implementing these practices to develop and deploy software in operational environments, often without a full understanding and consideration of security. Also, most software today relies on one or more third-party components, yet organizations often have little or no visibility into and understanding of how these components are developed, integrated, deployed, and maintained, as well as the practices used to ensure the components\u2019 security. To help improve the security of DevOps practices, the NCCoE is planning a DevSecOps project that will focus initially on developing and documenting an applied risk-based approach and recommendations for secure DevOps and software supply chain practices consistent with the Secure Software Development Framework (SSDF), Cybersecurity Supply Chain Risk Management (C-SCRM), and other NIST, government, and industry guidance. This project will apply these DevSecOps practices in proof-of-concept use case scenarios that will each be specific to a technology, programming language, and industry sector. Both closed source (proprietary) and open source technology will be used to demonstrate the use cases. This project will result in a freely available NIST Cybersecurity Practice Guide.\r\n- https://www.intel.com/content/www/us/en/newsroom/news/2022-intel-innovation-day-2-livestream-replay.html#gs.djq36o\r\n - Similar to the software labeling, with Alice we are trying to cross these streams\r\n - Datasheets for Datasets\r\n - https://arxiv.org/abs/1803.09010\r\n - > The machine learning community currently has no standardized process for documenting datasets, which can lead to severe consequences in high-stakes domains. To address this gap, we propose datasheets for datasets. 
In the electronics industry, every component, no matter how simple or complex, is accompanied with a datasheet that describes its operating characteristics, test results, recommended uses, and other information. By analogy, we propose that every dataset be accompanied with a datasheet that documents its motivation, composition, collection process, recommended uses, and so on. Datasheets for datasets will facilitate better communication between dataset creators and dataset consumers, and encourage the machine learning community to prioritize transparency and accountability.\r\n\r\n> Slide from Andrew Ng's Intel Innovation 2022 Luminary Keynote\r\n> Source: https://www.intel.com/content/www/us/en/newsroom/news/2022-intel-innovation-day-2-livestream-replay.html#gs.iex8mr\r\n> ![image](https://user-images.githubusercontent.com/5950433/193330714-4bcceea4-4402-468f-82a9-51882939452c.png)\r\n\r\n- Possible alignment with Andrew's \"Data-Centric AI\"\r\n - is the discipline of systematically engineering the data used to build an AI system\r\n - This is what we're doing with Alice\r\n- Possible alignment with Andrew's \"The iterative process of ML development\"\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#entity-analysis-trinity\r\n - Intent / Train model\r\n - Establish correlations between threat model intent and collected data / errors (telemetry or static analysis, policy, failures)\r\n - Dynamic analysis / Improve data\r\n - We tweak the code to make it do different things to see different data. The application of overlays. Think over time.\r\n - Static / Error analysis\r\n - There might be async debug initiated here, but this maps pretty nicely conceptually, since we'd think of this as a static process; we already have some errors to analyze if we're at this step.\r\n\r\n![Entity Analysis Trinity](https://user-images.githubusercontent.com/5950433/188203911-3586e1af-a1f6-434a-8a9a-a1795d7a7ca3.svg)\r\n\r\n- Gist for v2 API call cached: https://gist.github.com/pdxjohnny/ab1bf170dce272cecdd317eae55d1174\r\n- TODO\r\n - [ ] Clean up SCITT OpenSSF Use Case\r\n - https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md\r\n - https://mailarchive.ietf.org/arch/msg/scitt/cxRvcTEUNEhlxE_AJyspdx9y06w/\r\n - [ ] Get back to Kate\r\n - [ ] SCITT for NVD style feed data\r\n - [ ] Patch CVE Bin Tool to support validation\r\n - See Dick Brooks's email: https://mailarchive.ietf.org/arch/msg/scitt/cxRvcTEUNEhlxE_AJyspdx9y06w/\r\n - > Ray\u2019s statement: \u201cI can't imagine that you could ask some other\r\n > entity other than the mfr that created the device\r\n > to provide the reference, and attest to it's validity.\u201d\r\n >\r\n > This is also true for software vulnerabilities. 
Only the software product developer has access to the source code needed to answer the question, \u201cIs my software product vulnerable to exploitation by CVE-XYZ?\u201d\r\n >\r\n > This is what a NIST VDR provides \u2013 a vulnerability disclosure report from a software owner to a customer indicating the vulnerability status of their product at the SBOM component level;\r\n > - https://energycentral.com/c/pip/what-nist-sbom-vulnerability-disclosure-report-vdr\r\n >\r\n > Software vendors provide links to attestations using a Vendor Response File (VRF), which is yet another artifact that needs to be checked for trustworthiness:\r\n >\r\n > - https://energycentral.com/c/pip/advice-software-vendors-prepare-omb-m-22-18-requirements\r\n >\r\n > The VDR and VRF are both considered artifacts, which the author is making a statement of trustworthiness, that needs to be vetted by a trusted party, resulting in a claim that gets placed into a trusted registry becoming a \u201ctransparent claim\u201d in a SCITT registry.\r\n >\r\n > A consumer should be able to query the trustworthiness of the VDR and VRF artifacts using a SCITT Transparency Service, having nothing more than the original VDR and VRF artifacts in their possession.\r\n - SCITT is awesome because it supports this offline verification,\r\n which is important for us with Alice because we will be running\r\n in parallel/concurrently across many instances of her. These will\r\n sometimes compute fully offline (offline RL?). Therefore we want to\r\n be able to check validity of data before handing off to EDEN nodes\r\n which might lose connection. This enables them to verify offline\r\n data pushes that updated their cache. This allows entities to act in\r\n accordance with strategic principles by validating data on entry,\r\n producing receipts offline, and then rejoining those to the other\r\n nodes receiving those input streams. They need to have these offline\r\n receipts when they produce receipts for new input to maintain provenance\r\n chains (collecting data for inference within a flow running across multiple\r\n EDEN nodes doing active learning based on perceived trustworthiness of inputs).\r\n - [ ] Buy fully working mouse\r\n - [ ] Buy mousepad\r\n - [ ] Practice on ergonomic keyboard\r\n - [ ] gif of AOE1 install building for github.com/pdxjohnny/pdxjohnny/README.md\r\n - [ ] Communicate to Alice she MUST stop creating double issues with todos command\r\n - Fix the bug\r\n - [ ] SBOM, VEX, etc. feeds to ActivityPub, websub, RSS, web5 (ATP Data Repositories or if W3C or DIF has something)\r\n - [ ] Rebuild on trigger\r\n- Future\r\n - [ ] Auto sync asciinema recs / stream to https://github.com/asciinema/asciinema-server\r\n - [ ] Conversion to SBOM, VEX, etc. feeds\r\n - [ ] Coder demo / templates\r\n - Workspace / template as server\r\n - [ ] Pull request Atuin to not change the way the up arrow works\r\n - [ ] Respond to https://mailarchive.ietf.org/arch/msg/scitt/fg6_z2HauVl5d6mklUnMQivE57Y/\r\n and see if we can collaborate.\r\n - [ ] Auto sync Atuin https://github.com/ellie/atuin/blob/main/docs/server.md\r\n - [ ] Conversion to SBOM, VEX, etc. feeds\r\n - [ ] Coder demo / templates\r\n - Workspace / template as server
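\r\n- Paging sketch for the NVD 1.0 style endpoint above. A minimal sketch, assuming the base URL points at a local replay server (hypothetical); the `startIndex` / `resultsPerPage` / `totalResults` field names follow the dumped request/response pairs:\r\n\r\n```python\r\n# Rough sketch of a client paging through an NVD 1.0 style endpoint.\r\n# The base URL is a hypothetical local server; field names follow the dumps.\r\nimport json\r\nimport urllib.parse\r\nimport urllib.request\r\n\r\ndef fetch_all(base_url='http://127.0.0.1:8000/rest/json/cves/1.0'):\r\n    entries, start_index, page_size = [], 0, 2000\r\n    while True:\r\n        params = urllib.parse.urlencode(\r\n            {'startIndex': start_index, 'resultsPerPage': page_size})\r\n        with urllib.request.urlopen(f'{base_url}?{params}') as response:\r\n            page = json.load(response)\r\n        entries.extend(page.get('result', {}).get('CVE_Items', []))\r\n        # Step by what the server says it returned, not what was asked for\r\n        start_index += page.get('resultsPerPage', page_size)\r\n        if start_index >= page.get('totalResults', 0):\r\n            return entries\r\n```"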
},
{
"body": "## 2022-11-16 Portland Linux Kernel November meetup\r\n\r\n- https://www.meetup.com/portland-linux-kernel-meetup/events/289592627/\r\n- Talked to Andy most of the time (x86, kvm nested)\r\n - Asked him what he's excited about\r\n - He's stoked on profiling and perf counters, good stuff to be stoked on.\r\n - Mentioned ptrace, instruction count per cycle I think, can't quite remember.\r\n - Told him will circle back once we are to retriggering for regressions.\r\n- Semantic grep\r\n- https://www.kernel.org/doc/html/v6.0/dev-tools/coccinelle.html\r\n - Idea is to infer what the input to coccinelle is (figure out appropriate semantic patch)\r\n- Gave example of three developers working on different branches in different repos.\r\n Yes we aren't supposed to have long lived feature branches, but if you have three\r\n short lived dev branches you're still here.\r\n - Alice works in the background constantly trying to find the \"state of the art\"\r\n for the combination of those branches.\r\n - Alice is always trying to ensure you're working off the context local dynamic\r\n state of the art, LIVE at HEAD for decentralized development.\r\n - Git allows your source control to be decentralized but this allows yo\r\n to take full advantage of that, grep A/B testing rebase cherry-pick all\r\n permutations (how dataflows already call operations, grep for food / recipe\r\n example)."
}
]
},
{
"body": "# 2022-11-17 Engineering Logs",
"replies": [
{
"body": "## 2022-11-17 @pdxjohnny Engineering Logs\r\n\r\n- Verifiable Credentials\r\n - https://verite.id/verite/appendix/primer\r\n - https://github.com/uport-project/veramo\r\n - \r\n- OIDC\r\n - https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#getting-started-with-oidc\r\n- docs/arch/alice/discussion/0001/reply_0007.md BJJ analogy, land in Coach Alice?\r\n- Alignment\r\n - GSoC rubric as way of grading proposed compute contract /\r\n engagement / manifest (instance) / work item / GitHub issue / work.\r\n - https://dffml.github.io/dffml-pre-image-removal/contributing/gsoc/rubric.html\r\n\r\n![dffml-gsoc-grading-rubric-table](https://user-images.githubusercontent.com/5950433/202493540-90b52a01-337a-4098-a102-021fe338372d.png)\r\n\r\nhttps://github.com/intel/dffml/blob/3530ee0d20d1062605f82d1f5055f455f8c2c68f/docs/contributing/gsoc/rubric.rst#L1-L134\r\n\r\n- This thread stopped working / loading on my phone :(\r\n - Light laptop also apparently crumbling under weight of GitHub rendered thread\r\n- Thread needs to become something VEX/SBOM/WEB3/5 soon\r\n - Very soon this is unusable. one things fixed (Linux PC) and another thing breaks\r\n the thread. Such is the life of those of Chaos.\r\n- PWA with root of trust as brave wallet?\r\n - Offline sync of data with provenance by local SCITT with root of trust to brave wallet.\r\n - See \"SCITT for NVD style feed data\" children/downstream(links)/sub-bullet points (trying to figure out most ergonomic wording, child parent is antiquated/not descriptive enough (it's a one to many when looking from bulletpoint item at ancestry, tree, knowledge graph, links) with online cloning so we need to keep thinking) [2022-11-16 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4157129)\r\n - https://github.com/pdxjohnny/use-cases/blob/openssf_metrics/openssf_metrics.md\r\n - > As a follow on to the OpenSSF Metrics use case document and [Living Threat Models are better than Dead Threat Models](https://www.youtube.com/watch?v=TMlC_iAK3Rg&list=PLtzAOVTpO2jYt71umwc-ze6OmwwCIMnLw) [Rolling Alice: Volume 1: Coach Alice: Chapter 1: Down the Dependency Rabbit-Hole Again](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0001_down_the_dependency_rabbit_hole_again.md) will cover how we identify and query provenance on dependencies where caching on data flow execution is assisted via quering public SCITT infrastructure and sourcing cached state from trustworthy parties.\r\n\r\n```console\r\n$ dffml service dev export -configloader json alice.cli:AlicePleaseLogTodosCLIDataFlow | tee logtodos.json && (echo '```mermaid' && dffml dataflow diagram logtodos.json && echo '```') | gh gist create -f \"LOG_TODOS_DATAFLOW_DIAGRAM.md\" -`\r\n```\r\n\r\n**alice.cli:AlicePleaseLogTodosCLIDataFlow**\r\n\r\n```mermaid\r\ngraph TD\r\nsubgraph a759a07029077edc5c37fea0326fa281[Processing Stage]\r\nstyle a759a07029077edc5c37fea0326fa281 fill:#afd388b5,stroke:#a4ca7a\r\nsubgraph d9f2c7ced7f00879629c15363c8e307d[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guess_repo_string_is_url]\r\nstyle d9f2c7ced7f00879629c15363c8e307d fill:#fff4de,stroke:#cece71\r\n37178be7db9283b44a1786fef58ffa8d[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guess_repo_string_is_url]\r\n5c7743e872c165030dcf051c712106fc(repo_string)\r\n5c7743e872c165030dcf051c712106fc --> 
37178be7db9283b44a1786fef58ffa8d\r\n8d32e3f614b2c8f9d23e7469eaa1da12(result)\r\n37178be7db9283b44a1786fef58ffa8d --> 8d32e3f614b2c8f9d23e7469eaa1da12\r\nend\r\nsubgraph ed8e05e445eabbcfc1a201e580b1371e[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guessed_repo_string_is_operations_git_url]\r\nstyle ed8e05e445eabbcfc1a201e580b1371e fill:#fff4de,stroke:#cece71\r\nf129d360149fb01bbfe1ed8c2f9bbaa2[alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guessed_repo_string_is_operations_git_url]\r\n77a8695545cb64a7becb9f50343594c3(repo_url)\r\n77a8695545cb64a7becb9f50343594c3 --> f129d360149fb01bbfe1ed8c2f9bbaa2\r\nd259a05785074877b9509ed686e03b3a(result)\r\nf129d360149fb01bbfe1ed8c2f9bbaa2 --> d259a05785074877b9509ed686e03b3a\r\nend\r\nsubgraph 0fb0b360e14eb7776112a5eaff5252de[alice.please.log.todos.todos.OverlayCLI:cli_has_repos]\r\nstyle 0fb0b360e14eb7776112a5eaff5252de fill:#fff4de,stroke:#cece71\r\n81202a774dfaa2c4d640d25b4d6c0e55[alice.please.log.todos.todos.OverlayCLI:cli_has_repos]\r\n7ba42765e6fba6206fd3d0d7906f6bf3(cmd)\r\n7ba42765e6fba6206fd3d0d7906f6bf3 --> 81202a774dfaa2c4d640d25b4d6c0e55\r\n904eb6737636f1d32a6d890f449e9081(result)\r\n81202a774dfaa2c4d640d25b4d6c0e55 --> 904eb6737636f1d32a6d890f449e9081\r\nend\r\nsubgraph 964c0fbc5f3a43fce3f0d9f0aed08981[alice.please.log.todos.todos.OverlayCLI:cli_is_meant_on_this_repo]\r\nstyle 964c0fbc5f3a43fce3f0d9f0aed08981 fill:#fff4de,stroke:#cece71\r\nb96195c439c96fa7bb4a2d616bbe47c5[alice.please.log.todos.todos.OverlayCLI:cli_is_meant_on_this_repo]\r\n2a071a453a1e677a127cee9775d0fd9f(cmd)\r\n2a071a453a1e677a127cee9775d0fd9f --> b96195c439c96fa7bb4a2d616bbe47c5\r\nf6bfde5eece6eb52bb4b4a3dbc945d9f(result)\r\nb96195c439c96fa7bb4a2d616bbe47c5 --> f6bfde5eece6eb52bb4b4a3dbc945d9f\r\nend\r\nsubgraph 2e2e8520e9f9420ffa9e54ea29965019[alice.please.log.todos.todos.OverlayCLI:cli_run_on_repo]\r\nstyle 2e2e8520e9f9420ffa9e54ea29965019 fill:#fff4de,stroke:#cece71\r\nf60739d83ceeff1b44a23a6c1be4e92c[alice.please.log.todos.todos.OverlayCLI:cli_run_on_repo]\r\n0ac5645342c7e58f9c227a469d90242e(repo)\r\n0ac5645342c7e58f9c227a469d90242e --> f60739d83ceeff1b44a23a6c1be4e92c\r\n6e82a330ad9fcc12d0ad027136fc3732(result)\r\nf60739d83ceeff1b44a23a6c1be4e92c --> 6e82a330ad9fcc12d0ad027136fc3732\r\nend\r\nsubgraph 49130011bcac425879a677c5486ff0cc[alice.please.log.todos.todos:gh_issue_create_code_of_conduct]\r\nstyle 49130011bcac425879a677c5486ff0cc fill:#fff4de,stroke:#cece71\r\n31c8b817615cfd43254dba99ea2586cb[alice.please.log.todos.todos:gh_issue_create_code_of_conduct]\r\n5066ca1af8926ae2c081d71233288d58(body)\r\n5066ca1af8926ae2c081d71233288d58 --> 31c8b817615cfd43254dba99ea2586cb\r\na429b8b3ec4b6cd90e9c697a3330b012(file_present)\r\na429b8b3ec4b6cd90e9c697a3330b012 --> 31c8b817615cfd43254dba99ea2586cb\r\nccd02a25d1ee7e94729a758b676b7050(repo)\r\nccd02a25d1ee7e94729a758b676b7050 --> 31c8b817615cfd43254dba99ea2586cb\r\nabe38e44e9660841c1abe25ec6ba5ff3(title)\r\nabe38e44e9660841c1abe25ec6ba5ff3 --> 31c8b817615cfd43254dba99ea2586cb\r\nc704cbd635083d06f8d11109ded0597d(issue_url)\r\n31c8b817615cfd43254dba99ea2586cb --> c704cbd635083d06f8d11109ded0597d\r\nend\r\nsubgraph 4613afaf00bf0fb8f861ba8a80e664bc[alice.please.log.todos.todos:gh_issue_create_contributing]\r\nstyle 4613afaf00bf0fb8f861ba8a80e664bc fill:#fff4de,stroke:#cece71\r\na243f5b589a38383012170167e99bee9[alice.please.log.todos.todos:gh_issue_create_contributing]\r\ne891bc5f6cc73351082f3f93b486d702(body)\r\ne891bc5f6cc73351082f3f93b486d702 --> 
a243f5b589a38383012170167e99bee9\r\n633e21066f9a79ca7a0c580486d1a9e9(file_present)\r\n633e21066f9a79ca7a0c580486d1a9e9 --> a243f5b589a38383012170167e99bee9\r\n4aaa89e2af6f5c3bc457139808c7cecb(repo)\r\n4aaa89e2af6f5c3bc457139808c7cecb --> a243f5b589a38383012170167e99bee9\r\nbaa9fd440df8cd74a8e3e987077068fd(title)\r\nbaa9fd440df8cd74a8e3e987077068fd --> a243f5b589a38383012170167e99bee9\r\nc672fc455bc58d3fe05f0af332cfb8f2(issue_url)\r\na243f5b589a38383012170167e99bee9 --> c672fc455bc58d3fe05f0af332cfb8f2\r\nend\r\nsubgraph 7772f7447cabfad14065ddf1ad712a0f[alice.please.log.todos.todos:gh_issue_create_readme]\r\nstyle 7772f7447cabfad14065ddf1ad712a0f fill:#fff4de,stroke:#cece71\r\n90c6b15432ca7a4081208f659e5c809b[alice.please.log.todos.todos:gh_issue_create_readme]\r\ndf9081024c299071492b0f54df68ee10(body)\r\ndf9081024c299071492b0f54df68ee10 --> 90c6b15432ca7a4081208f659e5c809b\r\na3a402edf5e037041b2cc3714d9a6970(file_present)\r\na3a402edf5e037041b2cc3714d9a6970 --> 90c6b15432ca7a4081208f659e5c809b\r\n3eabfefcbc7ad816c89a983dcfebb66e(repo)\r\n3eabfefcbc7ad816c89a983dcfebb66e --> 90c6b15432ca7a4081208f659e5c809b\r\n78e47e381d0a2d2aba099b60a43d59b7(title)\r\n78e47e381d0a2d2aba099b60a43d59b7 --> 90c6b15432ca7a4081208f659e5c809b\r\nab4cc56bd2c79c32bec4c6e1cbdea717(issue_url)\r\n90c6b15432ca7a4081208f659e5c809b --> ab4cc56bd2c79c32bec4c6e1cbdea717\r\nend\r\nsubgraph 259dd82d03b72e83f5594fb70e224c7d[alice.please.log.todos.todos:gh_issue_create_security]\r\nstyle 259dd82d03b72e83f5594fb70e224c7d fill:#fff4de,stroke:#cece71\r\n157d90c800047d63c2e9fbc994007c0b[alice.please.log.todos.todos:gh_issue_create_security]\r\na20e86e85c1ec2f0340182025acfa192(body)\r\na20e86e85c1ec2f0340182025acfa192 --> 157d90c800047d63c2e9fbc994007c0b\r\n1195a910ea74b27c6eba7a58c13810dc(file_present)\r\n1195a910ea74b27c6eba7a58c13810dc --> 157d90c800047d63c2e9fbc994007c0b\r\n24e86931fc4eb531ba30a1457b5844a2(repo)\r\n24e86931fc4eb531ba30a1457b5844a2 --> 157d90c800047d63c2e9fbc994007c0b\r\n596eedb0a320d0a1549018637df28b39(title)\r\n596eedb0a320d0a1549018637df28b39 --> 157d90c800047d63c2e9fbc994007c0b\r\n106ceb5a00f7f2d8cb56bfea7dd69137(issue_url)\r\n157d90c800047d63c2e9fbc994007c0b --> 106ceb5a00f7f2d8cb56bfea7dd69137\r\nend\r\nsubgraph b8e0594907ccea754b3030ffc4bdc3fc[alice.please.log.todos.todos:gh_issue_create_support]\r\nstyle b8e0594907ccea754b3030ffc4bdc3fc fill:#fff4de,stroke:#cece71\r\n6aeac86facce63760e4a81b604cfab0b[alice.please.log.todos.todos:gh_issue_create_support]\r\n18f9a62bdd22ede12d6ea5eac5490ff2(body)\r\n18f9a62bdd22ede12d6ea5eac5490ff2 --> 6aeac86facce63760e4a81b604cfab0b\r\ndace6da55abe2ab1c5c9a0ced2f6833d(file_present)\r\ndace6da55abe2ab1c5c9a0ced2f6833d --> 6aeac86facce63760e4a81b604cfab0b\r\nd2a58f644d7427227cefd56492dfcef9(repo)\r\nd2a58f644d7427227cefd56492dfcef9 --> 6aeac86facce63760e4a81b604cfab0b\r\n9ba4bcdc22dcbab276f68288bfb4d0b1(title)\r\n9ba4bcdc22dcbab276f68288bfb4d0b1 --> 6aeac86facce63760e4a81b604cfab0b\r\n7f2eb20bcd650dc00cde5ca0355b578f(issue_url)\r\n6aeac86facce63760e4a81b604cfab0b --> 7f2eb20bcd650dc00cde5ca0355b578f\r\nend\r\nsubgraph cd002409ac60a3eea12f2139f2743c52[alice.please.log.todos.todos:git_repo_to_git_repository_checked_out]\r\nstyle cd002409ac60a3eea12f2139f2743c52 fill:#fff4de,stroke:#cece71\r\ne58ba0b1a7efba87321e9493d340767b[alice.please.log.todos.todos:git_repo_to_git_repository_checked_out]\r\n00a9f6e30ea749940657f87ef0a1f7c8(repo)\r\n00a9f6e30ea749940657f87ef0a1f7c8 --> 
e58ba0b1a7efba87321e9493d340767b\r\nbb1abf628d6e8985c49381642959143b(repo)\r\ne58ba0b1a7efba87321e9493d340767b --> bb1abf628d6e8985c49381642959143b\r\nend\r\nsubgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL]\r\nstyle d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71\r\nf577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL]\r\n7440e73a8e8f864097f42162b74f2762(URL)\r\n7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40\r\n8e39b501b41c5d0e4596318f80a03210(valid)\r\nf577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210\r\nend\r\nsubgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo]\r\nstyle af8da22d1318d911f29b95e687f87c5d fill:#fff4de,stroke:#cece71\r\n155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo]\r\need77b9eea541e0c378c67395351099c(URL)\r\need77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n8b5928cd265dd2c44d67d076f60c8b05(ssh_key)\r\n8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n4e1d5ea96e050e46ebf95ebc0713d54c(repo)\r\n155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c\r\n6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL}\r\n6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\nend\r\nsubgraph 98179e1c9444a758d9565431f371b232[dffml_operations_innersource.operations:code_of_conduct_present]\r\nstyle 98179e1c9444a758d9565431f371b232 fill:#fff4de,stroke:#cece71\r\nfb772128fdc785ce816c73128e0afd4d[dffml_operations_innersource.operations:code_of_conduct_present]\r\nf333b126c62bdbf832dddf105278d218(repo)\r\nf333b126c62bdbf832dddf105278d218 --> fb772128fdc785ce816c73128e0afd4d\r\n1233aac886e50641252dcad2124003c9(result)\r\nfb772128fdc785ce816c73128e0afd4d --> 1233aac886e50641252dcad2124003c9\r\nend\r\nsubgraph d03657cbeff4a7501071526c5227d605[dffml_operations_innersource.operations:contributing_present]\r\nstyle d03657cbeff4a7501071526c5227d605 fill:#fff4de,stroke:#cece71\r\n8da2c8a3eddf27e38838c8b6a2cd4ad1[dffml_operations_innersource.operations:contributing_present]\r\n2a1ae8bcc9add3c42e071d0557e98b1c(repo)\r\n2a1ae8bcc9add3c42e071d0557e98b1c --> 8da2c8a3eddf27e38838c8b6a2cd4ad1\r\n52544c54f59ff4838d42ba3472b02589(result)\r\n8da2c8a3eddf27e38838c8b6a2cd4ad1 --> 52544c54f59ff4838d42ba3472b02589\r\nend\r\nsubgraph 3ab6f933ff2c5d1c31f5acce50ace507[dffml_operations_innersource.operations:readme_present]\r\nstyle 3ab6f933ff2c5d1c31f5acce50ace507 fill:#fff4de,stroke:#cece71\r\nae6634d141e4d989b0f53fd3b849b101[dffml_operations_innersource.operations:readme_present]\r\n4d289d268d52d6fb5795893363300585(repo)\r\n4d289d268d52d6fb5795893363300585 --> ae6634d141e4d989b0f53fd3b849b101\r\n65fd35d17d8a7e96c9f7e6aaedb75e3c(result)\r\nae6634d141e4d989b0f53fd3b849b101 --> 65fd35d17d8a7e96c9f7e6aaedb75e3c\r\nend\r\nsubgraph da39b149b9fed20f273450b47a0b65f4[dffml_operations_innersource.operations:security_present]\r\nstyle da39b149b9fed20f273450b47a0b65f4 fill:#fff4de,stroke:#cece71\r\nc8921544f4665e73080cb487aef7de94[dffml_operations_innersource.operations:security_present]\r\ne682bbcfad20caaab15e4220c81e9239(repo)\r\ne682bbcfad20caaab15e4220c81e9239 --> c8921544f4665e73080cb487aef7de94\r\n5d69c4e5b3601abbd692ade806dcdf5f(result)\r\nc8921544f4665e73080cb487aef7de94 --> 5d69c4e5b3601abbd692ade806dcdf5f\r\nend\r\nsubgraph 062b8882104862540d584516edc60008[dffml_operations_innersource.operations:support_present]\r\nstyle 062b8882104862540d584516edc60008 
fill:#fff4de,stroke:#cece71\r\n5cc75c20aee40e815abf96726508b66d[dffml_operations_innersource.operations:support_present]\r\nf0e4cd91ca4f6b278478180a188a2f5f(repo)\r\nf0e4cd91ca4f6b278478180a188a2f5f --> 5cc75c20aee40e815abf96726508b66d\r\n46bd597a57e034f669df18ac9ae0a153(result)\r\n5cc75c20aee40e815abf96726508b66d --> 46bd597a57e034f669df18ac9ae0a153\r\nend\r\nsubgraph 55a339b2b9140e7d9c3448e706288e6e[operations.innersource.dffml_operations_innersource.cli:github_repo_id_to_clone_url]\r\nstyle 55a339b2b9140e7d9c3448e706288e6e fill:#fff4de,stroke:#cece71\r\ne90587117185b90364bd54700bfd4e3b[operations.innersource.dffml_operations_innersource.cli:github_repo_id_to_clone_url]\r\n725810a22f04a3ff620021588233815f(repo_id)\r\n725810a22f04a3ff620021588233815f --> e90587117185b90364bd54700bfd4e3b\r\nd2ee13433e404b6ef59d0f0344e28e2f(result)\r\ne90587117185b90364bd54700bfd4e3b --> d2ee13433e404b6ef59d0f0344e28e2f\r\nend\r\nend\r\nsubgraph a4827add25f5c7d5895c5728b74e2beb[Cleanup Stage]\r\nstyle a4827add25f5c7d5895c5728b74e2beb fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph 58ca4d24d2767176f196436c2890b926[Output Stage]\r\nstyle 58ca4d24d2767176f196436c2890b926 fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph inputs[Inputs]\r\nstyle inputs fill:#f6dbf9,stroke:#a178ca\r\n6e82a330ad9fcc12d0ad027136fc3732 --> 5c7743e872c165030dcf051c712106fc\r\n8d32e3f614b2c8f9d23e7469eaa1da12 --> 77a8695545cb64a7becb9f50343594c3\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> 7ba42765e6fba6206fd3d0d7906f6bf3\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> 2a071a453a1e677a127cee9775d0fd9f\r\n904eb6737636f1d32a6d890f449e9081 --> 0ac5645342c7e58f9c227a469d90242e\r\nf6bfde5eece6eb52bb4b4a3dbc945d9f --> 0ac5645342c7e58f9c227a469d90242e\r\n25d4e646671f80ac105f05de50445ba5(seed<br>CodeOfConductIssueBody)\r\n25d4e646671f80ac105f05de50445ba5 --> 5066ca1af8926ae2c081d71233288d58\r\n1233aac886e50641252dcad2124003c9 --> a429b8b3ec4b6cd90e9c697a3330b012\r\nbb1abf628d6e8985c49381642959143b --> ccd02a25d1ee7e94729a758b676b7050\r\n44ec56a4fd4b5eea9c8523dcb881d2d1(seed<br>CodeOfConductIssueTitle)\r\n44ec56a4fd4b5eea9c8523dcb881d2d1 --> abe38e44e9660841c1abe25ec6ba5ff3\r\nc94383981c3a071b8c3df7293c8c7c92(seed<br>ContributingIssueBody)\r\nc94383981c3a071b8c3df7293c8c7c92 --> e891bc5f6cc73351082f3f93b486d702\r\n52544c54f59ff4838d42ba3472b02589 --> 633e21066f9a79ca7a0c580486d1a9e9\r\nbb1abf628d6e8985c49381642959143b --> 4aaa89e2af6f5c3bc457139808c7cecb\r\n90c6a88275f27b28dc12f5741ac1652f(seed<br>ContributingIssueTitle)\r\n90c6a88275f27b28dc12f5741ac1652f --> baa9fd440df8cd74a8e3e987077068fd\r\n1daacccd02f8117e67ad3cb8686a732c(seed<br>ReadmeIssueBody)\r\n1daacccd02f8117e67ad3cb8686a732c --> df9081024c299071492b0f54df68ee10\r\n65fd35d17d8a7e96c9f7e6aaedb75e3c --> a3a402edf5e037041b2cc3714d9a6970\r\nbb1abf628d6e8985c49381642959143b --> 3eabfefcbc7ad816c89a983dcfebb66e\r\n0c1ab2d4bda10e1083557833ae5c5da4(seed<br>ReadmeIssueTitle)\r\n0c1ab2d4bda10e1083557833ae5c5da4 --> 78e47e381d0a2d2aba099b60a43d59b7\r\nb076a6070cf7626bccd630198450637c(seed<br>SecurityIssueBody)\r\nb076a6070cf7626bccd630198450637c --> a20e86e85c1ec2f0340182025acfa192\r\n5d69c4e5b3601abbd692ade806dcdf5f --> 1195a910ea74b27c6eba7a58c13810dc\r\nbb1abf628d6e8985c49381642959143b --> 24e86931fc4eb531ba30a1457b5844a2\r\nd734943b101c6e465df8c4cabe9b872e(seed<br>SecurityIssueTitle)\r\nd734943b101c6e465df8c4cabe9b872e --> 
596eedb0a320d0a1549018637df28b39\r\na7f3a4f2059bb4b3c170322febb4e93f(seed<br>SupportIssueBody)\r\na7f3a4f2059bb4b3c170322febb4e93f --> 18f9a62bdd22ede12d6ea5eac5490ff2\r\n46bd597a57e034f669df18ac9ae0a153 --> dace6da55abe2ab1c5c9a0ced2f6833d\r\nbb1abf628d6e8985c49381642959143b --> d2a58f644d7427227cefd56492dfcef9\r\n2ae304b14108a13de9dfa57f1e77cc2f(seed<br>SupportIssueTitle)\r\n2ae304b14108a13de9dfa57f1e77cc2f --> 9ba4bcdc22dcbab276f68288bfb4d0b1\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 00a9f6e30ea749940657f87ef0a1f7c8\r\nd259a05785074877b9509ed686e03b3a --> 7440e73a8e8f864097f42162b74f2762\r\nd2ee13433e404b6ef59d0f0344e28e2f --> 7440e73a8e8f864097f42162b74f2762\r\nd259a05785074877b9509ed686e03b3a --> eed77b9eea541e0c378c67395351099c\r\nd2ee13433e404b6ef59d0f0344e28e2f --> eed77b9eea541e0c378c67395351099c\r\na6ed501edbf561fda49a0a0a3ca310f0(seed<br>git_repo_ssh_key)\r\na6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05\r\n8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce\r\nbb1abf628d6e8985c49381642959143b --> f333b126c62bdbf832dddf105278d218\r\nbb1abf628d6e8985c49381642959143b --> 2a1ae8bcc9add3c42e071d0557e98b1c\r\nbb1abf628d6e8985c49381642959143b --> 4d289d268d52d6fb5795893363300585\r\nbb1abf628d6e8985c49381642959143b --> e682bbcfad20caaab15e4220c81e9239\r\nbb1abf628d6e8985c49381642959143b --> f0e4cd91ca4f6b278478180a188a2f5f\r\n090b151d70cc5b37562b42c64cb16bb0(seed<br>GitHubRepoID)\r\n090b151d70cc5b37562b42c64cb16bb0 --> 725810a22f04a3ff620021588233815f\r\nend\r\n```\r\n\r\n- The flow looks fine the way it's wired in the above mermaid diagram\r\n - Guessing it's an issue with `subflow` and the multi-context `run()`.\r\n - HEAD: f61bd161aa738ede314723b6bbb9667449abdd67\r\n\r\n```console\r\n$ alice please log todos -log debug -keys https://github.com/pdxjohnny/testaaa\r\n$ for repo_url in $(echo https://github.com/pdxjohnny/testaaa); do gh issue list --search \"Recommended Community Standard:\" -R \"${repo_url}\" | grep -v '2022-11-05'; done\r\n59 OPEN Recommended Community Standard: SUPPORT 2022-11-17 17:05:08 +0000 UTC\r\n58 OPEN Recommended Community Standard: SECURITY 2022-11-17 17:05:06 +0000 UTC\r\n57 OPEN Recommended Community Standard: README 2022-11-17 17:05:05 +0000 UTC\r\n56 OPEN Recommended Community Standard: CONTRIBUTING 2022-11-17 17:05:04 +0000 UTC\r\n6 OPEN Recommended Community Standard: SUPPORT 2022-11-04 06:33:26 +0000 UTC\r\n5 OPEN Recommended Community Standard: SUPPORT 2022-11-04 06:28:41 +0000 UTC\r\n4 OPEN Recommended Community Standard: SUPPORT 2022-11-04 06:27:42 +0000 UTC\r\n55 OPEN Recommended Community Standard: CODE_OF_CONDUCT 2022-11-17 17:05:02 +0000 UTC\r\n1 OPEN Recommended Community Standard: README 2022-06-25 01:12:18 +0000 UTC\r\n2 OPEN Recommended Community Standards 2022-06-25 01:12:20 +0000 UTC\r\n```\r\n\r\n- Unclear what's up, going to send and just close duplicates\r\n\r\n```console\r\n$ grep Stage:\\ PROCESSING .output.2022-11-16T20:49:13+00:00.txt\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:operations.innersource.dffml_operations_innersource.cli:github_repo_id_to_clone_url Stage: PROCESSING: operations.innersource.dffml_operations_innersource.cli:github_repo_id_to_clone_url\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos.OverlayCLI:cli_has_repos Stage: PROCESSING: alice.please.log.todos.todos.OverlayCLI:cli_has_repos\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos.OverlayCLI:cli_is_meant_on_this_repo Stage: 
PROCESSING: alice.please.log.todos.todos.OverlayCLI:cli_is_meant_on_this_repo\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos.OverlayCLI:cli_run_on_repo Stage: PROCESSING: alice.please.log.todos.todos.OverlayCLI:cli_run_on_repo\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guess_repo_string_is_url Stage: PROCESSING: alice.please.log.todos.todos.AlicePleaseLogTodosDataFlow:guess_repo_string_is_url\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:operations.innersource.dffml_operations_innersource.cli:github_repo_id_to_clone_url Stage: PROCESSING: operations.innersource.dffml_operations_innersource.cli:github_repo_id_to_clone_url\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:check_if_valid_git_repository_URL Stage: PROCESSING: check_if_valid_git_repository_URL\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:check_if_valid_git_repository_URL Stage: PROCESSING: check_if_valid_git_repository_URL\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:clone_git_repo Stage: PROCESSING: clone_git_repo\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:clone_git_repo Stage: PROCESSING: clone_git_repo\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:git_repo_to_git_repository_checked_out Stage: PROCESSING: alice.please.log.todos.todos:git_repo_to_git_repository_checked_out\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:code_of_conduct_present Stage: PROCESSING: dffml_operations_innersource.operations:code_of_conduct_present\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:contributing_present Stage: PROCESSING: dffml_operations_innersource.operations:contributing_present\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:readme_present Stage: PROCESSING: dffml_operations_innersource.operations:readme_present\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:security_present Stage: PROCESSING: dffml_operations_innersource.operations:security_present\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:support_present Stage: PROCESSING: dffml_operations_innersource.operations:support_present\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_code_of_conduct Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_code_of_conduct\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:git_repo_to_git_repository_checked_out Stage: PROCESSING: alice.please.log.todos.todos:git_repo_to_git_repository_checked_out\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:code_of_conduct_present Stage: PROCESSING: dffml_operations_innersource.operations:code_of_conduct_present\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:contributing_present Stage: PROCESSING: dffml_operations_innersource.operations:contributing_present\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:readme_present Stage: PROCESSING: 
dffml_operations_innersource.operations:readme_present\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:security_present Stage: PROCESSING: dffml_operations_innersource.operations:security_present\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:dffml_operations_innersource.operations:support_present Stage: PROCESSING: dffml_operations_innersource.operations:support_present\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_code_of_conduct Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_code_of_conduct\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_contributing Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_contributing\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_contributing Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_contributing\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_readme Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_readme\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_security Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_security\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_readme Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_readme\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_security Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_security\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_support Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_support\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.log.todos.todos:gh_issue_create_support Stage: PROCESSING: alice.please.log.todos.todos:gh_issue_create_support\r\n$ do alice please log todos -log debug -record-def GitHubRepoID -keys \"${github_repo_id}\" 2>&1 | tee .output.$(date -Iseconds).txt\r\n```\r\n\r\n- https://github.com/decentralized-identity/credential-manifest/issues/125#issuecomment-1310728595\r\n - No movement on this yet\r\n - Checked for other signs of life in [kimdhamilton](https://github.com/kimdhamilton)'s trains of thought (aka recent activity on GitHub)\r\n - https://github.com/centrehq/verite\r\n - https://verite.id/verite\r\n - Ding ding ding!\r\n- TODO\r\n - [x] Partial left handed mouse day\r\n - Back left base of neck headache? 
Related?\r\n - Butterfly keyboard for even a few minutes has made me nauseous, not sure if related.\r\n - [ ] Review https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#getting-started-with-oidc\r\n - [ ] Perhaps reuse within the OpenSSF metrics doc if the license allows and it would help; unknown, haven't read it yet.\r\n - [ ] Prototype infra docs as YAML as overlay with SaaSBOM or OBOM or whatever it was that's applicable\r\n - [ ] Review ideas for dev automation dataflows https://github.com/pdxjohnny/pdxjohnny.github.io/commit/328aee6351d3d12f72abe93b5be0bcacea64c3ef and update Alice docs accordingly\r\n - [ ] Sync opened tabs to active shell context, synced to engineering logs\r\n - https://developer.chrome.com/docs/extensions/reference/tabs/\r\n - https://github.com/pdxjohnny/pdxjohnny.github.io/blob/abfa83255d77eaaf35f92593828ba7a6a7001fb3/content/posts/dev-environment.md?plain=1#L116-L119\r\n - [ ] Debug double issue creation\r\n - [ ] Log `GraphQL: was submitted too quickly (createIssue)` issues, deal with? Add retry? (retry sketch at the end of this entry)\r\n - [ ] Get back to Elsa with the learning methodologies similarity thing, grep?\r\n - [ ] Document two devs working together\r\n - See poly repo pull model CR0/4 example (which I also talked to Kees about yesterday at the meetup) https://github.com/intel/dffml/issues/1315#issuecomment-1066971630\r\n - [ ] Start Vol 4 with whatever was in the notes about it recently, can't remember right now\r\n - [x] Matt nodded in relation to SCITT\r\n - [x] Marc might pursue matrix manifest approach for Zephyr build to test handoff\r\n - [x] Several conversations about CD and manifests\r\n - Mentioned #1061\r\n - Forgot to mention that there is something related to #1207...\r\n - [ ] NVDStyle as first stab at stream of consciousness to find vuln via cve-bin-tool (mock output if need be to \"find\" vuln)\r\n - [ ] Trigger rebuild of wheel and push to GitHub releases\r\n - [ ] `alice please contribute cicd` to run templating on the GitHub Actions,\r\n `workflow_dispatch` style (that calls reusable).\r\n - [ ] Do DevCloud demo\r\n - https://github.com/intel/dffml/issues/1247\r\n - Spin up DevCloud, deploy a GitHub Actions runner, and hermetically build \ud83e\udd19 the DFFML main package with manifests and SCITT receipts\r\n - `DevCloudOrchestrator`?
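\r\n\r\nA minimal sketch of the retry TODO above (handling `GraphQL: was submitted too quickly (createIssue)`); the `create_issue_with_retry` helper and its arguments are hypothetical, not existing Alice code:\r\n\r\n```python\r\nimport asyncio\r\n\r\n\r\nasync def create_issue_with_retry(repo_url, title, body, retries=5, delay=3.0):\r\n    # Hypothetical helper: retry `gh issue create` when GitHub's GraphQL\r\n    # rate limiting answers with \"was submitted too quickly\".\r\n    for attempt in range(retries):\r\n        proc = await asyncio.create_subprocess_exec(\r\n            \"gh\", \"issue\", \"create\", \"-R\", repo_url, \"--title\", title, \"--body\", body,\r\n            stdout=asyncio.subprocess.PIPE,\r\n            stderr=asyncio.subprocess.PIPE,\r\n        )\r\n        stdout, stderr = await proc.communicate()\r\n        if proc.returncode == 0:\r\n            return stdout.decode().strip()  # URL of the created issue\r\n        if b\"was submitted too quickly\" not in stderr:\r\n            raise RuntimeError(stderr.decode())\r\n        await asyncio.sleep(delay * (2 ** attempt))  # exponential backoff\r\n    raise RuntimeError(f\"Gave up creating issue on {repo_url} after {retries} attempts\")\r\n```"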
}
]
},
{
"body": "# 2022-11-18 Engineering Logs",
"replies": [
{
"body": "## 2022-11-18 @pdxjohnny Engineering Logs\r\n\r\n- https://social-embed.git-pull.com/docs/wc/\r\n - This looks interesting\r\n - https://oembed.com/\r\n - > oEmbed is a format for allowing an embedded representation of a URL on third party sites. The simple API allows a website to display embedded content (such as photos or videos) when a user posts a link to that resource, without having to parse the resource directly.\r\n- https://ocaml.org\r\n - Used for Linux kernel semantic patches\r\n- https://github.com/cue-lang/cue\r\n - Need to play with the Cue language\r\n- GitHub Actions templates docs\r\n - [Reusable workflows]() are identified by the presence of [`on.workflow_call`](https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#onworkflow_call). An example of a reusable workflow for container builds following the [manifest](https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md) pattern can be found in the [`*build_images_containers.yml` files](https://github.com/intel/dffml/blob/main/.github/workflows/build_images_containers.yml).\r\n- GitHub Actions runner: support SCITT receipts on containers / actions\r\n- `podman`: support SCITT receipts\r\n- https://ariadne.space/2019/07/13/federation-what-flows-where-and-why/\r\n - > most of the risks described here are mitigated by telling mastodon to use authorized fetch mode. please turn authorized fetch mode on, for your own good.\r\n- https://hacker.solar/books/about-this-site/page/what-is-hacker-solar\r\n- https://github.com/intel/cve-bin-tool/issues/2334#issuecomment-1315643093\r\n - https://social.treehouse.systems/@ariadne/109365116698192103\r\n - We are going to try to hybridize the authorized fetch mode with SCITT receipts and then bridge that into web5\r\n - Also touched on recent OIDC verification via notary\r\n- Need to remove the time from the tmux status line for idle detection to work, so that it doesn't tick every second and make giant files when there is no new output other than the time\r\n - https://github.com/git-pull/tao-of-tmux/blob/master/manuscript/10-scripting.md#formats-formats\r\n\r\n```console\r\n$ nodemon -e py --exec 'clear; nvd_api_key=$NVD_API_KEY LONG_TESTS=1 timeout 10s python3.10 -um coverage run -m pytest -v --log-level=DEBUG --log-cli-level=DEBUG test/test_nvd_api.py::TestNVD_API::test_total_results_count -k 2.0; test 1'\r\n...\r\n____________________________________________________________ TestNVD_API.test_total_results_count[2.0-feed1-stats1] ____________________________________________________________\r\n\r\nself = <test.test_nvd_api.TestNVD_API object at 0x7f8dcaa7cf70>, api_version = '2.0', feed = <httptest.httptest.Server object at 0x7f8dcaa7c7c0>, stats = <httptest.httptest.Server object at 0x7f8dcaa7c700>\r\n...\r\n> assert len(nvd_api.all_cve_entries) >= nvd_api.total_results\r\nE assert 0 >= 10\r\n...\r\ntest/test_nvd_api.py:88: 
AssertionError\r\n--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- Captured log setup ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------\r\nDEBUG asyncio:selector_events.py:54 Using selector: EpollSelector\r\nDEBUG asyncio:selector_events.py:54 Using selector: EpollSelector\r\n-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------\r\nFetching incremental metadata from NVD... \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 0% -:--:--\r\nDownloading Feeds from NVD... \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 100% 0:00:00\r\n-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------\r\n127.0.0.1 - - [18/Nov/2022 08:38:09] \"GET /?reporttype=countsbystatus HTTP/1.1\" 200 -\r\n127.0.0.1 - - [18/Nov/2022 08:38:09] \"GET /2.0?startIndex=0&resultsPerPage=1 HTTP/1.1\" 200 -\r\n127.0.0.1 - - [18/Nov/2022 08:38:09] \"GET /2.0?startIndex=0&resultsPerPage=2000&lastModStartDate=2022-11-16T16:36:09:895&lastModEndDate=2022-11-18T16:38:09:902 HTTP/1.1\" 200 -\r\n127.0.0.1 - - [18/Nov/2022 08:38:12] \"GET /2.0?startIndex=0&resultsPerPage=2000&lastModStartDate=2022-11-16T16:36:09:895&lastModEndDate=2022-11-18T16:38:09:902 HTTP/1.1\" 200 -\r\n127.0.0.1 - - [18/Nov/2022 08:38:12] \"GET /2.0?startIndex=2000&resultsPerPage=2000&lastModStartDate=2022-11-16T16:36:09:895&lastModEndDate=2022-11-18T16:38:09:902 HTTP/1.1\" 200 
-\r\n---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- Captured log call ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------\r\nINFO cve_bin_tool.NVD_API:nvd_api.py:135 Fetching metadata from NVD...\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:158 ParseResult(scheme='', netloc='', path='/', params='', query='reporttype=countsbystatus', fragment='')\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:163 {'reporttype': ['countsbystatus']}\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:172 Serving stats...\r\nINFO cve_bin_tool.NVD_API:nvd_api.py:137 Got metadata from NVD: {'Total': 10, 'Rejected': 0, 'Received': 0, 'Modified': 0, 'Undergoing Analysis': 0, 'Awaiting Analysis': 0}\r\nINFO cve_bin_tool.NVD_API:nvd_api.py:140 self.total_results = Total: 10 - Rejected: 0\r\nINFO cve_bin_tool.NVD_API:nvd_api.py:144 Valiating NVD api...\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:158 ParseResult(scheme='', netloc='', path='/2.0', params='', query='startIndex=0&resultsPerPage=1', fragment='')\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:163 {'startIndex': ['0'], 'resultsPerPage': ['1']}\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:240 Serving validate NVD API: start_index: 0 results_per_page: 1...\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:274 Serving validate: results: {'format': 'NVD_CVE', 'resultsPerPage': 1, 'startIndex': 0, 'timestamp': '2022-11-18T08:38Z', 'totalResults': 10, 'version': '2.0', 'vulnerabilities': [{'cve': {'configurations': [{'nodes': [{'cpeMatch': [{'criteria': 'cpe:2.3:a:eric_allman:sendmail:5.58:*:*:*:*:*:*:*', 'matchCriteriaId': '1D07F493-9C8D-44A4-8652-F28B46CBA27C', 'vulnerable': True}], 'negate': False, 'operator': 'OR'}]}], 'descriptions': [{'lang': 'en', 'value': 'The debug command in Sendmail is enabled, allowing attackers to execute commands as root.'}, {'lang': 'es', 'value': 'El comando de depuraci\u00f3n de Sendmail est\u00e1 activado, permitiendo a atacantes ejecutar comandos como root.'}], 'id': 'CVE-1999-0095', 'lastModified': '2019-06-11T20:29:00.263', 'metrics': {'cvssMetricV2': [{'acInsufInfo': False, 'cvssData': {'accessComplexity': 'LOW', 'accessVector': 'NETWORK', 'authentication': 'NONE', 'availabilityImpact': 'COMPLETE', 'baseScore': 10.0, 'baseSeverity': 'HIGH', 'confidentialityImpact': 'COMPLETE', 'integrityImpact': 'COMPLETE', 'vectorString': 'AV:N/AC:L/Au:N/C:C/I:C/A:C', 'version': '2.0'}, 'exploitabilityScore': 10.0, 'impactScore': 10.0, 'obtainAllPrivilege': True, 'obtainOtherPrivilege': False, 'obtainUserPrivilege': False, 'source': 'nvd@nist.gov', 'type': 'Primary', 'userInteractionRequired': False}]}, 'published': '1988-10-01T04:00:00.000', 'references': [{'source': 'cve@mitre.org', 'url': 'http://seclists.org/fulldisclosure/2019/Jun/16'}, {'source': 'cve@mitre.org', 'url': 'http://www.openwall.com/lists/oss-security/2019/06/05/4'}, {'source': 'cve@mitre.org', 'url': 'http://www.openwall.com/lists/oss-security/2019/06/06/1'}, {'source': 'cve@mitre.org', 'url': 'http://www.securityfocus.com/bid/1'}], 'sourceIdentifier': 'cve@mitre.org', 'vulnStatus': 
'Modified', 'weaknesses': [{'description': [{'lang': 'en', 'value': 'NVD-CWE-Other'}], 'source': 'nvd@nist.gov', 'type': 'Primary'}]}}]}\r\nINFO cve_bin_tool.NVD_API:nvd_api.py:146 Valiated NVD api\r\nINFO cve_bin_tool.NVD_API:nvd_api.py:175 Fetching updated CVE entries after 2022-11-16T16:36:09:895\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:158 ParseResult(scheme='', netloc='', path='/2.0', params='', query='startIndex=0&resultsPerPage=2000&lastModStartDate=2022-11-16T16:36:09:895&lastModEndDate=2022-11-18T16:38:09:902', fragment='')\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:163 {'startIndex': ['0'], 'resultsPerPage': ['2000'], 'lastModStartDate': ['2022-11-16T16:36:09:895'], 'lastModEndDate': ['2022-11-18T16:38:09:902']}\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:284 Serving feed: start_index: 0 results_per_page: 2000...\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:336 Serving feed with 10 results\r\nINFO cve_bin_tool.NVD_API:nvd_api.py:189 Adding 10 CVE entries\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:158 ParseResult(scheme='', netloc='', path='/2.0', params='', query='startIndex=0&resultsPerPage=2000&lastModStartDate=2022-11-16T16:36:09:895&lastModEndDate=2022-11-18T16:38:09:902', fragment='')\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:158 ParseResult(scheme='', netloc='', path='/2.0', params='', query='startIndex=2000&resultsPerPage=2000&lastModStartDate=2022-11-16T16:36:09:895&lastModEndDate=2022-11-18T16:38:09:902', fragment='')\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:163 {'startIndex': ['0'], 'resultsPerPage': ['2000'], 'lastModStartDate': ['2022-11-16T16:36:09:895'], 'lastModEndDate': ['2022-11-18T16:38:09:902']}\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:163 {'startIndex': ['2000'], 'resultsPerPage': ['2000'], 'lastModStartDate': ['2022-11-16T16:36:09:895'], 'lastModEndDate': ['2022-11-18T16:38:09:902']}\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:284 Serving feed: start_index: 0 results_per_page: 2000...\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:284 Serving feed: start_index: 2000 results_per_page: 2000...\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:336 Serving feed with 10 results\r\nDEBUG alice.emulate.nvd.api:nvdstyle.py:336 Serving feed with 0 results\r\n-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- Captured log teardown --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------\r\nDEBUG asyncio:selector_events.py:54 Using selector: EpollSelector\r\n=================================================================================================================================================================================================================================================================================================================== short test summary info ============================================================================================================================================================== - 
\r\n=====================================================================================================================================================\r\nFAILED test/test_nvd_api.py::TestNVD_API::test_total_results_count[2.0-feed1-stats1] - assert 0 >= 10\r\n=============================================================================================================================================================================================================================================================================================================== 1 failed, 1 deselected in 6.51s ===============================================================================================================================================================================================================================================================================================================\r\n[nodemon] clean exit - waiting for changes before restart\r\n```\r\n\r\n- Ah ha! Enabled debug logging because noticed we weren't seeing the\r\n \"Send Request\" log client side.\r\n\r\n```diff\r\ndiff --git a/cve_bin_tool/log.py b/cve_bin_tool/log.py\r\nindex 85b7009..749b867 100644\r\n--- a/cve_bin_tool/log.py\r\n+++ b/cve_bin_tool/log.py\r\n@@ -30,4 +30,4 @@ logging.basicConfig(\r\n root_logger = logging.getLogger()\r\n \r\n LOGGER = logging.getLogger(__package__)\r\n-LOGGER.setLevel(logging.INFO)\r\n+LOGGER.setLevel(logging.DEBUG)\r\ndiff --git a/cve_bin_tool/nvd_api.py b/cve_bin_tool/nvd_api.py\r\nindex 28bc102..0f82748 100644\r\n--- a/cve_bin_tool/nvd_api.py\r\n+++ b/cve_bin_tool/nvd_api.py\r\n@@ -130,14 +130,20 @@ class NVD_API:\r\n \r\n if not self.session:\r\n connector = aiohttp.TCPConnector(limit_per_host=19)\r\n- self.session = RateLimiter(\r\n- aiohttp.ClientSession(connector=connector, trust_env=True)\r\n- )\r\n+ self.session = aiohttp.ClientSession(connector=connector, trust_env=True)\r\n \r\n self.logger.info(\"Fetching metadata from NVD...\")\r\n cve_count = await self.nvd_count_metadata(self.session, self.stats)\r\n+ self.logger.info(\"Got metadata from NVD: %r\", cve_count)\r\n+\r\n+ self.total_results = cve_count[\"Total\"] - cve_count[\"Rejected\"]\r\n+ self.logger.info(\r\n+ f'self.total_results = Total: {cve_count[\"Total\"]} - Rejected: {cve_count[\"Rejected\"]}'\r\n+ )\r\n \r\n+ self.logger.info(\"Valiating NVD api...\")\r\n await self.validate_nvd_api()\r\n+ self.logger.info(\"Valiated NVD api\")\r\n \r\n if self.invalid_api:\r\n self.logger.warning(\r\n@@ -180,8 +186,6 @@ class NVD_API:\r\n progress.update(task)\r\n progress.update(task, advance=1)\r\n \r\n- else:\r\n- self.total_results = cve_count[\"Total\"] - cve_count[\"Rejected\"]\r\n self.logger.info(f\"Adding {self.total_results} CVE entries\")\r\n \r\n async def validate_nvd_api(self):\r\n@@ -227,7 +231,6 @@ class NVD_API:\r\n self.logger.debug(f\"Response received {response.status}\")\r\n if response.status == 200:\r\n fetched_data = await response.json()\r\n-\r\n if start_index == 0:\r\n # Update total results in case there is discrepancy between NVD dashboard and API\r\n reject_count = (\r\n@@ -238,6 +241,9 @@ class NVD_API:\r\n self.total_results = (\r\n fetched_data[\"totalResults\"] - reject_count\r\n )\r\n+ self.logger.info(\r\n+ f'self.total_results = Total: {fetched_data[\"totalResults\"]} - Rejected: {reject_count}'\r\n+ )\r\n if self.api_version == \"1.0\":\r\n self.all_cve_entries.extend(\r\n fetched_data[\"result\"][\"CVE_Items\"]\r\ndiff --git a/test/test_nvd_api.py b/test/test_nvd_api.py\r\nindex 
91cf1fb..e7e2a96 100644\r\n--- a/test/test_nvd_api.py\r\n+++ b/test/test_nvd_api.py\r\n@@ -2,16 +2,26 @@\r\n # SPDX-License-Identifier: GPL-3.0-or-later\r\n \r\n import os\r\n+import types\r\n import shutil\r\n import tempfile\r\n+import contextlib\r\n from datetime import datetime, timedelta\r\n from test.utils import LONG_TESTS\r\n \r\n import pytest\r\n+import aiohttp\r\n+import httptest\r\n+\r\n+import alice.threats.vulns.serve.nvdstyle\r\n \r\n from cve_bin_tool.cvedb import CVEDB\r\n from cve_bin_tool.data_sources import nvd_source\r\n-from cve_bin_tool.nvd_api import NVD_API\r\n+from cve_bin_tool.nvd_api import (\r\n+ NVD_API,\r\n+ FEED as NVD_API_FEED,\r\n+ NVD_CVE_STATUS,\r\n+)\r\n \r\n \r\n class TestNVD_API:\r\n@@ -42,14 +52,40 @@ class TestNVD_API:\r\n LONG_TESTS() != 1 or not os.getenv(\"nvd_api_key\"),\r\n reason=\"NVD tests run only in long tests\",\r\n )\r\n- async def test_total_results_count(self):\r\n+ @pytest.mark.parametrize(\r\n+ \"api_version, feed, stats\",\r\n+ [\r\n+ (\r\n+ \"1.0\",\r\n+ httptest.Server(alice.threats.vulns.serve.nvdstyle.NVDStyleHTTPHandler),\r\n+ httptest.Server(alice.threats.vulns.serve.nvdstyle.NVDStyleHTTPHandler),\r\n+ ),\r\n+ (\r\n+ \"2.0\",\r\n+ httptest.Server(alice.threats.vulns.serve.nvdstyle.NVDStyleHTTPHandler),\r\n+ httptest.Server(alice.threats.vulns.serve.nvdstyle.NVDStyleHTTPHandler),\r\n+ ),\r\n+ ],\r\n+ )\r\n+ async def test_total_results_count(self, api_version, feed, stats):\r\n \"\"\"Total results should be greater than or equal to the current fetched cves\"\"\"\r\n- nvd_api = NVD_API(api_key=os.getenv(\"nvd_api_key\") or \"\")\r\n- await nvd_api.get_nvd_params(\r\n- time_of_last_update=datetime.now() - timedelta(days=2)\r\n- )\r\n- await nvd_api.get()\r\n- assert len(nvd_api.all_cve_entries) >= nvd_api.total_results\r\n+ # TODO alice.nvd.TestHTTPServer will become either\r\n+ # alice.nvd.TestNVDVersion_1_0 or alice.nvd.TestNVDVersion_2_0\r\n+ # lambda *args: alice.nvd.TestHTTPServer(*args, directory=pathlib.Path(__file__).parent)\r\n+ with feed as feed_http_server, stats as stats_http_server:\r\n+ async with aiohttp.ClientSession() as session:\r\n+ nvd_api = NVD_API(\r\n+ feed=feed_http_server.url(),\r\n+ stats=stats_http_server.url(),\r\n+ api_key=os.getenv(\"nvd_api_key\") or \"\",\r\n+ session=session,\r\n+ api_version=api_version,\r\n+ )\r\n+ await nvd_api.get_nvd_params(\r\n+ time_of_last_update=datetime.now() - timedelta(days=2)\r\n+ )\r\n+ await nvd_api.get()\r\n+ assert len(nvd_api.all_cve_entries) >= nvd_api.total_results\r\n \r\n @pytest.mark.asyncio- \r\n\r\n @pytest.mark.skipif(\r\n```\r\n\r\n- Enabling debug logging resulted in the following statement being logged.\r\n - This failure should probably be an `ERROR` level rather than `DEBUG` log.\r\n\r\n```\r\nDEBUG cve_bin_tool.NVD_API:nvd_api.py:274 Failed to connect to NVD list indices must be integers or slices, not str\r\n```\r\n\r\n- Added traceback\r\n- Is NVD2 code needing to index? 
`fetched_data[\"vulnerabilities\"][index][\"cve\"]`?\r\n\r\n```\r\n\r\nERROR cve_bin_tool.NVD_API:nvd_api.py:276 Pausing requests for 3 seconds\r\nDEBUG cve_bin_tool.NVD_API:nvd_api.py:277 TypeError('list indices must be integers or slices, not str')\r\nTraceback (most recent call last):\r\n File \"/home/pdxjohnny/Documents/python/cve-bin-tool/cve_bin_tool/nvd_api.py\", line 254, in load_nvd_request\r\n fetched_data[\"vulnerabilities\"][\"cve\"]\r\nTypeError: list indices must be integers or slices, not str\r\n```\r\n\r\n- Found and fixed two issues (iteration sketch at the end of this entry)\r\n - intel/cve-bin-tool@afc4a9254683d2a7027bc6574e99d1b0d406d5bc\r\n - fix(nvd_api): Align v2 rejection handling with description schema updates\r\n - intel/cve-bin-tool@46cd825b126dd167158cae4f5e4ac7a32de2e08d\r\n - fix(nvd_api): extend all cve entries from v2 query top level vulnerabilities key\r\n\r\n[![asciicast](https://asciinema.org/a/538712.svg)](https://asciinema.org/a/538712)\r\n\r\n- Pushed 9f0a41ad55bdc7f295c435ebd51db77e3343b915\r\n - alice: threats: vulns: serve: nvdstyle: Fix serving of v2 style CVEs\r\n- Liquid Time-constant Networks Adaptive Online Networks\r\n - https://arxiv.org/pdf/2006.04439v1.pdf\r\n- TODO\r\n - [ ] Finish scorecard demo and integrate into shouldi\r\n - Put this in \"Down the Dependency Rabbit Hole Again\" as one of the things we put in `THREATS.md`\r\n - [ ] `alice threats cicd` (`-keys https://github.com/intel/dffml`)\r\n - [ ] GitHub Actions workflow analysis overlays\r\n - [ ] Look for `runs-on:` and anything not GitHub hosted, then\r\n check `on:` triggers to ensure pull requests aren't being run.
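\r\n\r\nThe shape of the v2 fix above, as a minimal runnable sketch against the response shape captured in the logs (names here are illustrative, not the exact cve-bin-tool code):\r\n\r\n```python\r\n# The NVD 2.0 response captured above looks like:\r\n#   {\"totalResults\": ..., \"vulnerabilities\": [{\"cve\": {...}}, ...]}\r\n# \"vulnerabilities\" is a list, so fetched_data[\"vulnerabilities\"][\"cve\"]\r\n# raises TypeError; extend with the list and index each entry's \"cve\" key.\r\nall_cve_entries = []\r\nfetched_data = {\r\n    \"totalResults\": 1,\r\n    \"vulnerabilities\": [{\"cve\": {\"id\": \"CVE-1999-0095\"}}],\r\n}\r\nall_cve_entries.extend(fetched_data[\"vulnerabilities\"])\r\ncve_ids = [vuln[\"cve\"][\"id\"] for vuln in all_cve_entries]\r\nassert cve_ids == [\"CVE-1999-0095\"]\r\n```"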
},
{
"body": "## Overlays as Dynamic Context Aware Branches\r\n\r\n> TODO: more fanciful tutorial name\r\n\r\nAt a minimum it's like saying: when I check out this branch, I want you to cherry-pick these commits (semantically?) from these other branches (and run A/B cross validation, of course) and make that a sort of virtual branch, where those commits are applied but still tracked as dev, in-flight, or just alternately sourced versions (see the sketch at the end of this note).\r\n\r\n- References\r\n - https://github.com/intel/dffml/issues/1315#issuecomment-1066971630\r\n - Alice and Bob working on CR0/4\r\n - Examples of virtual branches\r\n - Turning on debug logging while working on NVD style API for use by\r\n cve-bin-tool (and Alice of course).\r\n - [2022-11-18 @pdxjohnny Engineering Logs](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4177910)\r\n- TODO\r\n - Knowledge graph of manifests with SCITT receipts\r\n - Stream of Consciousness\r\n - We share test results of cross validation and virtual branch node additions here\r\n - Alice, Bob, and Eve working with three separate repos\r\n - Cross validation comes into play here
}
]
},
{
"body": "# 2022-11-19 Engineering Logs",
"replies": [
{
"body": "## 2022-11-19 @pdxjohnny Engineering Logs\r\n\r\n- https://github.com/oras-project/oras-py\r\n - Put it all in the container registry\r\n- https://github.com/OpenChain-Project/Reference-Material/blob/master/Self-Certification/Checklist/Security-Assurance-1.1/en/Security-Assurance-1-1-Checklist-Version-2.md"
}
]
},
{
"body": "# 2022-11-20 Engineering Logs",
"replies": []
},
{
"body": "# 2022-11-21 Engineering Logs",
"replies": [
{
"body": "## 2022-11-21 @pdxjohnny Engineering Logs\r\n\r\n- https://github.com/CrunchyData/pg_eventserv\r\n - `FROM` rebuild chain pdxjohnny/dffml-operations-dockerhub@a738c35199afe82d8a35d97ce16711c6f19785c5\r\n- Going through old repos to look for logcat server\r\n - Found a bunch of code I forgot I wrote that is referenced in the Alice thread as deps\r\n - https://github.com/pdxjohnny/webrtcvpn\r\n - https://github.com/pdxjohnny/diffstream\r\n - https://github.com/pdxjohnny/telem/blob/8676810086c732e1a738ce58a6296993f7a87661/client/c/encrypt.c\r\n - https://github.com/pdxjohnny/hack\r\n - Looks like this packs shellcode for `exec` system calls on Linux\r\n - [![hack-the-planet](https://img.shields.io/badge/hack%20the-planet-blue)](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_easter_eggs.md#hack-the-planet-)\r\n - Ref shim\r\n - https://github.com/pdxjohnny/freeze-tool/tree/master/logger\r\n - Stream logging / cross this with telemetry one\r\n - https://github.com/pdxjohnny/video_chat/blob/master/image_video.js#L95\r\n - This comes in handy with VNC over PNG/JPEG etc. when massive hax are required\r\n - https://github.com/pdxjohnny/pysync/blob/master/sync.py\r\n - :grimacing: (cve-bin-tool vlcn-io/cr-sqlite below in TODO, been at this a while too)\r\n- https://github.com/oras-project/oras-py\r\n - https://github.com/opencontainers/distribution-spec\r\n - Inventory?\r\n - https://github.com/opencontainers/distribution-spec/blob/main/spec.md#enabling-the-referrers-api\r\n - https://github.com/intel/dffml/pull/1207#discussion_r1026981623\r\n - Stream of Consciousness?\r\n - Might already have WebSub or equivalent; implementation / ratification status was unclear. Dig more / investigate Open Architecture encoded (autocodec, multiformat, shim, custom basic, unencoded JSON, etc.) callback enabling.\r\n - OCI distribution spec all the things\r\n - Python packages\r\n - SBOM\r\n - VEX\r\n - SCITT\r\n\r\n![OCI distribution spec all the things meme](https://user-images.githubusercontent.com/5950433/203143783-b7f9e731-80bd-42c7-b97d-410d62676758.png)\r\n\r\n- Last Friday pushed alice: threats: vulns: serve: nvdstyle: Fix serving of v2 style CVEs - 9f0a41ad55bdc7f295c435ebd51db77e3343b915\r\n - We can now start serving threats!\r\n - Need to finish out the contribution to CVE Binary Tool first\r\n - https://github.com/intel/cve-bin-tool/issues/2334#issuecomment-1315643093\r\n- Found Distributed Android Testing pre-squash real initial webhook commit\r\n - Jul 27, 2015 - 7130e89473f12353f19afb935802b065759be571\r\n - > A webserver to receive json web hooks from gitlab_webhooks\r\n > The hooks are dealt with by calling the corresponding function in\r\n > hooks.py. For example a push is received so the function push in\r\n > hook.py is called and passed the hook data.\r\n - Well friends, it's only been 2,674 days since our first commit down CI lane.\r\n - Next step is we enable offline (offline CI, that is): we'll knit together our\r\n Data, Analysis, Control (DAC, aka Digital Analog Converter ;) loop that will\r\n get our software lifecycle analysis going. We're going to look at the supply\r\n chain of the thoughts (adding / using a dependency is a thought, it might also\r\n be a thought you took action on). 
You are what you EAT and same goes for software!\r\n Our analysis of the supply chains to our trains of thought seen within the\r\n software lifecycle is analogous to the software project as the entity, and our\r\n analysis of what it's EATing is an analysis of its digestion of those thoughts.\r\n Okay I think I wrote this somewhere else and I'm not having success explaining\r\n right now. It's also not so much offline CI as parity across environments, enabling\r\n context (process, workflow, DX) aware application of policy / config / logic.\r\n Aka the intermediate representation and the analysis pattern allow for translation.\r\n As we get more advanced we'll be leveraging (and implementing) our cross domain\r\n conceptual mapping (grep thread) techniques to translate these applications ad-hoc\r\n as feasibility and need allow, and our EAT wheel will start turning.\r\n - [WIP: Rolling Alice: Coach Alice: You are what you EAT!](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3885559)\r\n - This offline digestion is important to enable us to give Alice to developers\r\n and have her sit side by side to help them. Today we focus on vulns and\r\n security patches (version bumps?, `safety` check? - https://intel.github.io/dffml/main/shouldi.html#use-command). Tomorrow might be linting\r\n (`yamllint` for GitHub Actions).\r\n - Using the NIST NVD style API we now have, we can begin to issue events over that\r\n stream.\r\n - These events will be the communication of Alice's thoughts and actions, her\r\n development activity. We'll of course incrementally introduce overlays which\r\n increase sophistication of activities and intricacy of communications and\r\n triggers.\r\n- TODO\r\n - [ ] For the meeting recording to markdown / rST conversion we also need to screenshot if there is a deck presented\r\n - [ ] Contribute NVDStyle pieces to cve-bin-tool as needed for https://github.com/intel/cve-bin-tool/issues/2334#issuecomment-1315643093\r\n - [ ] SCITT receipts for each CVE (attached as separate record? attached within? wrapped?)\r\n - [ ] [download_nvd](https://github.com/pdxjohnny/download_nvd) but somehow hybridized with https://github.com/vlcn-io/cr-sqlite for conflict-free resolution deltas on the CVE Binary Database.\r\n - Or maybe go the bzdiff route\r\n - [ ] Finish scorecard demo and integrate into shouldi\r\n - Put this in \"Down the Dependency Rabbit Hole Again\" as one of the things we put in `THREATS.md`\r\n - [ ] `alice threats cicd` (`-keys https://github.com/intel/dffml`)\r\n - [ ] GitHub Actions workflow analysis overlays\r\n - [ ] Look for `runs-on:` and anything not GitHub hosted, then\r\n check `on:` triggers to ensure pull requests aren't being run (heuristic sketch at the end of this entry).\r\n - https://github.com/intel/dffml/issues/1422\r\n - [ ] Output to JSON source (so long as we derive from `RunRecordSet` we'll be done with this)\r\n - [ ] Have NVDStyle server take source as input/config so that we can point it at the discovered vulns\r\n - [ ] Track https://github.com/intel/cve-bin-tool/issues/2320#issuecomment-1303174689\r\n in relation to `policy.yml`\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice/0000_architecting_alice#what-is-alice\r\n - [ ] `alice please log todos -source static=json dynamic=nvdstyle`\r\n - [ ] Implement source for reading from NVDStyle API (op source for single function prototype?)\r\n - [ ] Enable creation of TODOs by overlaying operations which take the feature data as inputs (use dfpreprocess?)
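\r\n\r\nFor the `runs-on:` / `on:` TODO above, a heuristic sketch (the GitHub hosted label set is illustrative and incomplete, and non-string `runs-on` values are simply treated as self-hosted):\r\n\r\n```python\r\nimport yaml\r\n\r\n# Incomplete, illustrative set of GitHub hosted runner labels.\r\nGITHUB_HOSTED = {\"ubuntu-latest\", \"windows-latest\", \"macos-latest\"}\r\n\r\n\r\ndef self_hosted_and_pr_triggered(workflow_path):\r\n    # Flag workflows that run on anything not GitHub hosted while also\r\n    # being triggered by pull requests.\r\n    with open(workflow_path) as fileobj:\r\n        workflow = yaml.safe_load(fileobj)\r\n    # YAML 1.1 parses a bare `on` key as the boolean True.\r\n    triggers = workflow.get(\"on\", workflow.get(True, {}))\r\n    pr_triggered = \"pull_request\" in triggers or \"pull_request_target\" in triggers\r\n    self_hosted = any(\r\n        str(job.get(\"runs-on\")) not in GITHUB_HOSTED\r\n        for job in workflow.get(\"jobs\", {}).values()\r\n        if \"runs-on\" in job\r\n    )\r\n    return pr_triggered and self_hosted\r\n```"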
}
]
},
{
"body": "# 2022-11-22 Engineering Logs",
"replies": [
{
"body": "## 2022-11-22 @pdxjohnny Engineering Logs\r\n\r\n- https://www.science.org/doi/10.1126/science.ade9097\r\n - Some people did the diplomacy civ style thing\r\n - grep `docs/arch/alice/discussion` thread\r\n - https://youtu.be/u5192bvUS7k\r\n - https://twitter.com/ml_perception/status/1595070353063424000\r\n- Rebased in cve-bin-tool@main to [nvd_api_v2_tests](https://github.com/pdxjohnny/cve-bin-tool/compare/nvd_api_v2_tests) in pursuit of https://github.com/intel/cve-bin-tool/issues/2334\r\n\r\n[![asciicast](https://asciinema.org/a/539495.svg)](https://asciinema.org/a/539495)\r\n\r\n- https://github.com/OR13/didme.me/issues/18\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0007_an_image.md\r\n- https://twitter.com/tlodderstedt/status/1592641414504280064\r\n - https://openid.net/openid4vc/\r\n - OpenID for Verifiable Credentials (OpenID4VC)\r\n - https://www.slideshare.net/TorstenLodderstedt/openid-for-verifiable-credentials-iiw-35\r\n - https://openid.bitbucket.io/connect/openid-connect-self-issued-v2-1_0.html#name-sharing-claims-eg-vc-from-s\r\n - The following quotes are applicable to our DFFML CI/CD setup.\r\n We care about static analysis results and stuff (`alice shouldi`),\r\n for example auth of our runners (grep OSS scanning) and artifacts\r\n to push data to `data.chadig|nahdig.com` and then to the OpenSSF.\r\n - Ideally our data structures are self identifying and authing (UCAN, ATP, etc.)\r\n - We still need bridges into existing identity and auth infra\r\n - [DID + HSM Supply Chain Security Mitigation Option](https://github.com/intel/dffml/tree/alice/docs/arch/0007-A-GitHub-Public-Bey-and-TPM-Based-Supply-Chain-Security-Mitigation-Option.rst)\r\n - https://www.youtube.com/clip/Ugkxf-HtFY6sR_-EnGGksIik8eyAKQACE0_n?list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK\r\n - Vision: Reducing Overhead via Thought Communication Protocol\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0000_architecting_alice/0005_stream_of_consciousness.md\r\n - [2022-10-15 Engineering Logs: Rolling Alice: Architecting Alice: Thought Communication Protocol Case Study: DFFML](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3883683)\r\n - The video this was clipped from was linked in the commit message https://github.com/intel/dffml/commit/fc42d5bc756b96c36d14e7f620f9d37bc5e4a7fd\r\n - Found the previous stream of consciousness aligned with this. I had been meaning to look for it, we'll be back in this train of thought when we get to didme.me \"An Image\" python implementation.\r\n - https://www.youtube.com/watch?v=9y7d3RsXkbA&list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK\r\n - > [2.4. ](https://openid.bitbucket.io/connect/openid-connect-self-issued-v2-1_0.html#section-2.4)[Sharing Claims (e.g. VC) from Several Issuers in One Transaction](https://openid.bitbucket.io/connect/openid-connect-self-issued-v2-1_0.html#name-sharing-claims-eg-vc-from-s)\r\nWhen End-Users apply to open a banking account online, in most countries, they are required to submit scanned versions of the required documents. These documents are usually issued by different authorities, and are hard to verify in a digital form. A Self-issued OP directly representing the End-User may have access to a greater set of such information for example in the format of Verifiable Credentials, while a traditional OP may not have a business relationship which enables access to such a breadth of information. 
Self-Issued OPs could aggregate claims from multiple sources, potentially in multiple formats, then release them within a single transaction to a Relying Party. The Relying Party can then verify the authenticity of the information to make the necessary business decisions.\r\n - https://openid.net/wordpress-content/uploads/2022/06/OIDF-Whitepaper_OpenID-for-Verifiable-Credentials-V2_2022-06-23.pdf\r\n - > OpenID Connect, a protocol that enables deployment of federated Identity at scale, was built with User-Centricity in mind. The protocol is designed so that the Identity Provider releases the claims about the End-User to the Relying Party after obtaining consent directly from an End-User. This enables Identity Providers to enforce consent as the lawful basis for the presentation based on the Relying Party\u2019s privacy notice. The protocol also enables two kinds of Identity Providers, those controlled by the End-Users and those provided by the third parties. Now, User-Centricity is evolving to grant the End-Users more control, privacy and portability over their identity information. Using OpenID for Verifiable Credentials protocols, the End-Users can now directly present identity information to the Relying Parties. This empowers the End-Users to retain more control over the critical decisions when and what information they are sharing. Furthermore, the End-Users\u2019 privacy is preserved since Identity Providers no longer know what activity the End-Users are performing at which Relying Party. End-Users also gain portability of their identity information because it can now be presented to the Relying Parties who do not have a federated relationship with the Credential Issuer. Then the technical details of OpenID4VC are presented, alongside an explanation of certain decision choices that were made, such as why OpenID Connect, and OAuth 2.0 are well-suited as basis for presentation and issuance protocols for verifiable credentials. Finally, the whitepaper concludes by reiterating the importance of making choices for standards that meet certain use-cases in order to realize a globally interoperable verifiable credentials ecosystem. Achieving large-scale adoption of verifiable credentials will be \"by Evolution, not by Revolution\". The identity community can more swiftly empower people, and government authorities developing identity infrastructure and policies, by adopting standards like OpenID4VC that facilitate convergence and interoperation of existing and emerging standards.\r\n- https://vos.openlinksw.com/owiki/wiki/VOS/VOSIntro\r\n- https://github.com/OpenLinkSoftware/OSDS_extension\r\n- https://hobbit-project.github.io/\r\n- https://youtube.com/clip/Ugkxf-HtFY6sR_-EnGGksIik8eyAKQACE0_n\r\n - Vision: Reducing Overhead via Thought Communication Protocol\r\n- https://cloud.hasura.io/public/graphiql?header=content-type:application/json&endpoint=https://api.graphql.jobs\r\n- We're working on fixing the CI right now\r\n - The vuln serving `NVDStyle` is our base for comms right now (think manifests)\r\n - https://github.com/intel/dffml/blob/alice/docs/arch/0008-Manifest.md\r\n - This is how we will be facilitating Continuous Delivery.\r\n - Open source projects will implement vuln stream handling, we are\r\n hopefully piggybacking our `FROM` rebuild chain and so forth on top,\r\n once again, we're always looking for reliable resilient ubiquitously\r\n available comms. 
Reuse, reuse, reuse.\r\n- https://github.com/intel/dffml/issues/1421\r\n- Found some meetups to share Alice with\r\n- https://www.meetup.com/rainsec/events/289349686/\r\n - > RainSec - PDX Information Security Meetup: RainSec is an informal group of like-minded security professionals who meet to network and discuss topics of interest in a non-work, non-vendor setting. While our target audience is experienced information security professionals, this is a public event open to any interested parties. If you have a friend or colleague who might benefit, please pass an invite along.\r\n- https://www.meetup.com/hardware-happy-hour-3h-portland/events/289759128/\r\n - > Hardware Happy Hour is an informal way to socialize, show off your projects, and talk about the world of hardware.\r\n- https://www.meetup.com/ctrl-h/events/282093316/\r\n - > Dorkbot PDX (Virtual): Part virtual hackathon, part virtual geek social, these virtual biweekly meetings are a time for you to virtually join others for insight, inspiration or just insanity.\r\n - https://app.gather.town/app/1KLgyeL4yGzBeCAL/dorkbot\r\n- https://app.gather.town/app\r\n - UX wow. Landing into anon profile allowing actions / creation. Love it.\r\n- https://mastodon.online/@rjh/109388793314837723\r\n - > nsrllookup.com is back online after a long pandemic-related hiatus. If you need to sort wheat from chaff for large volumes of data, try removing every piece of data in NIST's collection.\r\n >\r\n > Many thanks to [@warthog9](https://mastodon.social/@warthog9@social.afront.org) for hosting nsrllookup.com all these years. :)\r\n - https://github.com/rjhansen/nsrlsvr\r\n - We should hybridize this with SCITT receipts returned for the content addresses, let's use SHA-384 or something stronger\r\n - https://mastodon.online/@rjh/109388812626470845\r\n - Let's use this hybrid with the NVDStyle API, or perhaps let's wait (8 minutes ago, Chaos smiles on us again ;) Really we should stick with OCI registry on our first pass here.\r\n - > Work on version 2 of nsrllookup is well underway. When I originally developed it, I elected to write my own very simple wire protocol. Although it still works fine, it means whenever I want to write a binding for a new programming language I have to rewrite the parser-generator.\r\n >\r\n > Version 2, currently underway, moves to gRPC. 
This should make it much easier to integrate with third-party tools like Autopsy.\r\n- Random LOL\r\n - Architecting Alice: Volume 0: Context: Part 1: Where are we: YouTube's automated captions: \"Intro, The Plan, Alice, Chaos, Nested Virtualization\"\r\n - Hit the nail on the head with that one ;P\r\n\r\n[![Architecting Alice: Volume 0: Context: Part 1: Where are we: YouTube's automated captions LOL: \"Intro, The Plan, Alice, Chaos, Nested Virtualization\"](https://user-images.githubusercontent.com/5950433/203405118-91f1d2d8-a9f7-42e8-a468-d984e7f7d7ae.png)](https://www.youtube.com/watch?v=dI1oGv7K21A&list=PLtzAOVTpO2jaHsS4o-sDzDyHEug-1KRbK)\r\n\r\n- https://docs.velociraptor.app/\r\n- https://www.thc.org/segfault/\r\n - https://github.com/hackerschoice/segfault\r\n - Stoooooked\r\n- https://www.thc.org\r\n- https://www.gsocket.io/\r\n - Doooooooooope\r\n - Let's see if there's a cross with DERP here, WireGuard is probably involved.\r\n - > [![gsocket-asciicast](https://asciinema.org/a/lL94Vsjz8JM0hCjnfKM173Ong.svg)](https://asciinema.org/a/lL94Vsjz8JM0hCjnfKM173Ong)\r\n- https://github.com/vanhauser-thc/\r\n- TODO\r\n - [ ] Finish https://github.com/intel/cve-bin-tool/issues/2334 (NVD 2.0 style client sketch at the end of this entry)\r\n - https://github.com/intel/cve-bin-tool/pull/2384
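\r\n\r\nFor the NVDStyle / #2334 work above, a minimal client sketch that pages through an NVD 2.0 style endpoint using the `startIndex` / `resultsPerPage` params and the top level `totalResults` / `vulnerabilities` keys seen in the captured logs; the URL is hypothetical:\r\n\r\n```python\r\nimport asyncio\r\n\r\nimport aiohttp\r\n\r\n\r\nasync def fetch_all_cves(base_url, results_per_page=2000):\r\n    # Page through an NVD 2.0 style API until totalResults is exhausted.\r\n    entries = []\r\n    start_index = 0\r\n    async with aiohttp.ClientSession() as session:\r\n        while True:\r\n            params = {\r\n                \"startIndex\": str(start_index),\r\n                \"resultsPerPage\": str(results_per_page),\r\n            }\r\n            async with session.get(base_url, params=params) as response:\r\n                data = await response.json()\r\n            entries.extend(data[\"vulnerabilities\"])\r\n            start_index += results_per_page\r\n            if start_index >= data[\"totalResults\"]:\r\n                return entries\r\n\r\n\r\n# Hypothetical local NVDStyle instance:\r\n# asyncio.run(fetch_all_cves(\"http://localhost:8000/2.0\"))\r\n```"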
}
]
},
{
"body": "# 2022-11-23 Engineering Logs",
"replies": [
{
"body": "## 2022-11-23 @pdxjohnny Engineering Logs\r\n\r\n- [alice: threats: cicd: github: workflow: Check for curl -k #1423](https://github.com/intel/dffml/issues/1423)\r\n- [alice: threats: cicd: github: workflow: Guess at if input should be passed as secret #1424](https://github.com/intel/dffml/issues/1424)\r\n- Alice, what entities are working on aligned trains of thought\r\n - Assumes current context\r\n - Could also specify train of thought via DID or petname or shortref or whatever\r\n - Overlap in architecture heatmaps\r\n - Overlap in conceptual upleveling\r\n - Add in related TODOs (e.g. the NVD API v2 related GitHub issues Anthony has been working on)\r\n - Graphs are fun\r\n - [WIP Rolling Alice: ?: ? - Working Title: Overlays as Dynamic Context Aware Branches](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-4180716)\r\n - [2022-10-15 Engineering Logs: Rolling Alice: Architecting Alice: Thought Communication Protocol Case Study: DFFML](https://github.com/intel/dffml/discussions/1406?sort=new#discussioncomment-3883683)\r\n\r\n![meme-anarchy-elmo-knowledge-graphs-for-the-Chaos-God](https://user-images.githubusercontent.com/5950433/203634346-111c884d-0f95-4066-addf-dbfbaeda4910.png)\r\n\r\n```console\r\n$ git clone https://github.com/pdxjohnny/cve-bin-tool -b nvd_api_v2_tests\r\n$ cd cve-bin-tool\r\n$ alice please tell me who is working on aligned trains of thought\r\nanthonyharrison\r\n$ alice please create state of the art virtual branch from those contributors and myself\r\n... runs cherry-picking cross validation / A/B feature flag testing the commits ...\r\n... cached state from team active dev sessions, CI, etc. via active overlays ...\r\n... which means this could be no-exec, pure static eval and creation based off ...\r\n... cherry-picks and their graph linked test results, see Zephyr recent stuff ...\r\n$ echo As mentioned