
@pdxjohnny
Created November 23, 2022 22:35
This file has been truncated; the JSON response below ends mid-record.
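The JSON that follows has the shape of a GitHub GraphQL API response for a repository's pinned discussions, including each comment's edit history (userContentEdits) and replies. For context, here is a minimal Python sketch of the kind of query that could produce such a response; the field names are inferred from the response itself, while the pagination arguments, the intel/dffml owner/name variables, and the use of the requests library are assumptions for illustration, not the query actually used to generate this file.

```python
# Minimal sketch (not part of the original gist): a GitHub GraphQL query whose
# response would have the same shape as the JSON below. Field names are taken
# from the response; "first:" page sizes and the intel/dffml variables are
# illustrative assumptions.
import json
import os

import requests  # third-party: pip install requests

QUERY = """
query($owner: String!, $name: String!) {
  repository(owner: $owner, name: $name) {
    pinnedDiscussions(first: 1) {
      nodes {
        discussion {
          title
          body
          category { name }
          comments(first: 10) {
            pageInfo { endCursor hasNextPage }
            totalCount
            nodes {
              body
              createdAt
              userContentEdits(first: 10) {
                pageInfo { endCursor hasNextPage }
                totalCount
                nodes { diff editedAt }
              }
              replies(first: 10) {
                pageInfo { endCursor hasNextPage }
                totalCount
                nodes {
                  body
                  createdAt
                  userContentEdits(first: 10) {
                    pageInfo { endCursor hasNextPage }
                    totalCount
                    nodes { diff editedAt }
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}
"""


def fetch_pinned_discussions(owner: str, name: str, token: str) -> dict:
    """POST the query to GitHub's GraphQL endpoint and return the parsed JSON."""
    response = requests.post(
        "https://api.github.com/graphql",
        headers={"Authorization": f"bearer {token}"},
        json={"query": QUERY, "variables": {"owner": owner, "name": name}},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # Assumes a GITHUB_TOKEN environment variable holding a personal access token.
    data = fetch_pinned_discussions("intel", "dffml", os.environ["GITHUB_TOKEN"])
    print(json.dumps(data, indent=2))
```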
{
"data": {
"repository": {
"pinnedDiscussions": {
"nodes": [
{
"discussion": {
"title": "Alice Engineering Comms",
"body": "These are the engineering logs of entities working on Alice. If you\r\nwork on [Alice](https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/) please use this thread as a way to communicate to others\r\nwhat you are working on. Each day has a log entry. Comment with your\r\nthoughts, activities, planning, etc. related to the development of\r\nAlice, our open source artificial general intelligence.\r\n\r\nThis thread is used as a communication mechanism for engineers so that\r\nothers can have full context around why entities did what they did\r\nduring their development process. This development lifecycle data helps\r\nus understand more about why decisions were made when we re-read the\r\ncode in the future (via cross referencing commit dates with dates in\r\nengineering logs). In this way we facilitate communication across\r\nboth time and space! Simply by writing things down. We live an an\r\nasynchronous world. Let's communicate like it.\r\n\r\nWe are collaboratively documenting strategy and implementation as\r\nliving documentation to help community communicate amongst itself\r\nand facilitate sync with potential users / other communities /\r\naligned workstreams.\r\n\r\n- References\r\n - https://gist.github.com/pdxjohnny/07b8c7b4a9e05579921aa3cc8aed4866\r\n - Progress Report Transcripts\r\n - https://github.com/intel/dffml/tree/alice/entities/alice/\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n - https://github.com/intel/dffml/pull/1401\r\n - https://github.com/intel/dffml/pull/1207\r\n - https://github.com/intel/dffml/pull/1061\r\n - #1315\r\n - Aligned threads elsewhere\r\n - https://mastodon.social/@pdxjohnny/109320563491316354\r\n - https://twitter.com/pdxjohnny/status/1522345950013845504\r\n - async comms / asynchronous communications\r\n - https://twitter.com/SergioRocks/status/1579110239408095232\r\n\r\n## Engineering Log Process\r\n\r\n- Alice, every day at 7 AM in Portland's timezone create a system context (the tick)\r\n - Merge with existing system context looked up from querying this thread if exists\r\n - In the future we will Alice will create and update system contexts.\r\n - We'll start with each day, then move to a week, then a fortnight, then 2 fortnights.\r\n - She'll parse the markdown document to rebuild the system context as if it's cached\r\n right before it would be synthesized to markdown, we then run updates and trigger\r\n update of the comment body. 
Eventually we won't use GitHub and just DID based stuff.\r\n We'll treat these all as trains of thought / chains of system contexts / state of the\r\n art fields.\r\n - Take a set of system contexts as training data\r\n - The system context which we visualize as a line dropped from the peak of a pyramid\r\n where it falls through the base.\r\n - We use cross domain conceptual mapping to align system contexts in a similar direction\r\n and drop ones which do are unhelpful, do not make the classification for \"good\"\r\n - What remains from our circular graph is a pyramid with the correct decisions\r\n (per prioritizer) \r\n - This line represents the \"state of the art\", the remembered (direct lookup) or\r\n predicted/inferred system contexts along this line are well rounded examples of\r\n where the field is headed, per upstream and overlay defined strategic plans\r\n and strategic principles\r\n - References:\r\n - `$ git grep -C 5 -i principle -- docs/arch/`\r\n - Source: https://github.com/intel/dffml/discussions/1369\r\n - Inputs\r\n - `date`\r\n - Type: `Union[str, Date]`\r\n - Example: `2022-07-18`\r\n - Default: `did:oa:architype:current-date`\r\n - Add yesterdays unfinished TODO items to this train of though with the \r\n - Create a document (docutils?)\r\n - Make the top level header `date` with \"Notes\" appended\r\n - Collect all previous days TODOs from within the individual entity comments within the thread for the days comment (the team summary for that day)\r\n - Drop any completed or struck through TODOs\r\n - Output a list item \"TODO\" with the underlying bullets with handle prepended and then the TODO contents\r\n - Create comments for individuals after this the current system context is posted and we have a live handle to it to reply with each individuals markdown document.\r\n - Synthesis the document to markdown (there is a python lib out there that can do docutils to md, can't remember the name right now)\r\n - Upsert comment in thread",
"category": {
"name": "General"
},
"comments": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpK5MjAyMi0xMS0xNlQyMzowODozNS0wODowMM4AP4sA",
"hasNextPage": true
},
"totalCount": 96,
"nodes": [
{
"body": "# 2022-07-18 Engineering Logs\r\n\r\n- TODO\r\n - [x] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting",
"createdAt": "2022-07-18T16:28:51Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABNWFA==",
"hasNextPage": false
},
"totalCount": 3,
"nodes": [
{
"diff": "# 2022-07-18 Engineering Logs\r\n\r\n- TODO\r\n - [x] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting",
"editedAt": "2022-07-22T19:12:23Z"
},
{
"diff": "# 2022-07-18 Engineering Logs\r\n\r\n- TODO\r\n - [ ] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting",
"editedAt": "2022-07-18T16:29:58Z"
},
{
"diff": "# 2022-07-18 Engineering Logs\r\n\r\n- TODO\r\n - [ ] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- References\r\n - https://medium.com/51nodes/decentralized-schema-registry-aa662b8db12b",
"editedAt": "2022-07-18T16:28:51Z"
}
]
},
"replies": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpK5MjAyMi0wNy0xOFQwOToyOTo1NC0wNzowMM4AMGwI",
"hasNextPage": false
},
"totalCount": 1,
"nodes": [
{
"body": "## 2022-07-18 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- Future\r\n - Engage with Loihi community\r\n - See what we can do here, not sure yet, play with system context / mitigation inference in devcloud?\r\n - https://www.intel.com/content/www/us/en/research/neuromorphic-community.html\r\n - https://download.intel.com/newsroom/2021/new-technologies/neuromorphic-computing-loihi-2-brief.pdf\r\n - https://www.intel.com/content/www/us/en/newsroom/news/intel-unveils-neuromorphic-loihi-2-lava-software.html\r\n - \r\n- References\r\n - https://medium.com/51nodes/decentralized-schema-registry-aa662b8db12b\r\n - https://www.microsoft.com/security/blog/2021/10/06/microsofts-5-guiding-principles-for-decentralized-identities/\r\n - https://ariadne.space/2022/07/17/how-efficient-can-cat1-be/\r\n - Usage of splice\r\n - https://github.com/NVlabs/eg3d\r\n - Seeing from different perspectives, inter domain conceptual mapping, encoded sysctxs alternate mitigations\r\n - https://github.com/robmarkcole/satellite-image-deep-learning\r\n - Knitting together system contexts (Alice could use for integration of various architectures)",
"createdAt": "2022-07-18T16:29:54Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABNWFg==",
"hasNextPage": false
},
"totalCount": 9,
"nodes": [
{
"diff": "## 2022-07-18 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- Future\r\n - Engage with Loihi community\r\n - See what we can do here, not sure yet, play with system context / mitigation inference in devcloud?\r\n - https://www.intel.com/content/www/us/en/research/neuromorphic-community.html\r\n - https://download.intel.com/newsroom/2021/new-technologies/neuromorphic-computing-loihi-2-brief.pdf\r\n - https://www.intel.com/content/www/us/en/newsroom/news/intel-unveils-neuromorphic-loihi-2-lava-software.html\r\n - \r\n- References\r\n - https://medium.com/51nodes/decentralized-schema-registry-aa662b8db12b\r\n - https://www.microsoft.com/security/blog/2021/10/06/microsofts-5-guiding-principles-for-decentralized-identities/\r\n - https://ariadne.space/2022/07/17/how-efficient-can-cat1-be/\r\n - Usage of splice\r\n - https://github.com/NVlabs/eg3d\r\n - Seeing from different perspectives, inter domain conceptual mapping, encoded sysctxs alternate mitigations\r\n - https://github.com/robmarkcole/satellite-image-deep-learning\r\n - Knitting together system contexts (Alice could use for integration of various architectures)",
"editedAt": "2022-07-22T19:12:26Z"
},
{
"diff": "## 2022-07-18 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- Future\r\n - Engage with Loihi community\r\n - See what we can do here, not sure yet, play with system context / mitigation inference in devcloud?\r\n - https://www.intel.com/content/www/us/en/research/neuromorphic-community.html\r\n - https://download.intel.com/newsroom/2021/new-technologies/neuromorphic-computing-loihi-2-brief.pdf\r\n - https://www.intel.com/content/www/us/en/newsroom/news/intel-unveils-neuromorphic-loihi-2-lava-software.html\r\n - \r\n- References\r\n - https://medium.com/51nodes/decentralized-schema-registry-aa662b8db12b\r\n - https://www.microsoft.com/security/blog/2021/10/06/microsofts-5-guiding-principles-for-decentralized-identities/\r\n - https://ariadne.space/2022/07/17/how-efficient-can-cat1-be/\r\n - Usage of splice\r\n - https://github.com/NVlabs/eg3d\r\n - Seeing from different perspectives, inter domain conceptual mapping, encoded sysctxs alternate mitigations\r\n - https://github.com/robmarkcole/satellite-image-deep-learning\r\n - Knitting together system contexts (Alice could use for integration of various architectures)",
"editedAt": "2022-07-18T21:11:10Z"
},
{
"diff": "## 2022-07-18 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- Future\r\n - Engage with Loihi community\r\n - See what we can do here, not sure yet, play with system context / mitigation inference in devcloud?\r\n - https://www.intel.com/content/www/us/en/research/neuromorphic-community.html\r\n - https://download.intel.com/newsroom/2021/new-technologies/neuromorphic-computing-loihi-2-brief.pdf\r\n - https://www.intel.com/content/www/us/en/newsroom/news/intel-unveils-neuromorphic-loihi-2-lava-software.html\r\n - \r\n- References\r\n - https://medium.com/51nodes/decentralized-schema-registry-aa662b8db12b\r\n - https://www.microsoft.com/security/blog/2021/10/06/microsofts-5-guiding-principles-for-decentralized-identities/\r\n - https://ariadne.space/2022/07/17/how-efficient-can-cat1-be/\r\n - Usage of splice\r\n - https://github.com/NVlabs/eg3d\r\n - Seeing from different perspectives, inter domain conceptual mapping, encoded sysctxs alternate mitigations",
"editedAt": "2022-07-18T20:55:24Z"
},
{
"diff": "## 2022-07-18 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- Future\r\n - Engage with Loihi community\r\n - See what we can do here, not sure yet, play with system context / mitigation inference in devcloud?\r\n - https://www.intel.com/content/www/us/en/research/neuromorphic-community.html\r\n - https://download.intel.com/newsroom/2021/new-technologies/neuromorphic-computing-loihi-2-brief.pdf\r\n - https://www.intel.com/content/www/us/en/newsroom/news/intel-unveils-neuromorphic-loihi-2-lava-software.html\r\n - \r\n- References\r\n - https://medium.com/51nodes/decentralized-schema-registry-aa662b8db12b\r\n - https://www.microsoft.com/security/blog/2021/10/06/microsofts-5-guiding-principles-for-decentralized-identities/\r\n - https://ariadne.space/2022/07/17/how-efficient-can-cat1-be/\r\n - Usage of splice\r\n - https://github.com/NVlabs/eg3d\r\n - Seeing from different perspectives, encoded sysctxs alternate mitigations",
"editedAt": "2022-07-18T20:55:08Z"
},
{
"diff": "## 2022-07-18 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- Future\r\n - Engage with Loihi community\r\n - See what we can do here, not sure yet, play with system context / mitigation inference in devcloud?\r\n - https://www.intel.com/content/www/us/en/research/neuromorphic-community.html\r\n - https://download.intel.com/newsroom/2021/new-technologies/neuromorphic-computing-loihi-2-brief.pdf\r\n - https://www.intel.com/content/www/us/en/newsroom/news/intel-unveils-neuromorphic-loihi-2-lava-software.html\r\n- References\r\n - https://medium.com/51nodes/decentralized-schema-registry-aa662b8db12b\r\n - https://www.microsoft.com/security/blog/2021/10/06/microsofts-5-guiding-principles-for-decentralized-identities/\r\n - https://ariadne.space/2022/07/17/how-efficient-can-cat1-be/\r\n - Usage of splice",
"editedAt": "2022-07-18T18:46:29Z"
},
{
"diff": "## 2022-07-18 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- Future\r\n - Engage with Loihi community\r\n - See what we can do here, not sure yet, play with system context / mitigation inference in devcloud?\r\n - https://www.intel.com/content/www/us/en/research/neuromorphic-community.html\r\n - https://download.intel.com/newsroom/2021/new-technologies/neuromorphic-computing-loihi-2-brief.pdf\r\n - https://www.intel.com/content/www/us/en/newsroom/news/intel-unveils-neuromorphic-loihi-2-lava-software.html\r\n- References\r\n - https://medium.com/51nodes/decentralized-schema-registry-aa662b8db12b\r\n - https://www.microsoft.com/security/blog/2021/10/06/microsofts-5-guiding-principles-for-decentralized-identities/",
"editedAt": "2022-07-18T17:56:17Z"
},
{
"diff": "## 2022-07-18 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- Future\r\n - Engage with Loihi community\r\n - See what we can do here, not sure yet, play with system context / mitigation inference in devcloud?\r\n - https://www.intel.com/content/www/us/en/research/neuromorphic-community.html\r\n - https://download.intel.com/newsroom/2021/new-technologies/neuromorphic-computing-loihi-2-brief.pdf\r\n - https://www.intel.com/content/www/us/en/newsroom/news/intel-unveils-neuromorphic-loihi-2-lava-software.html\r\n- References\r\n - https://medium.com/51nodes/decentralized-schema-registry-aa662b8db12b",
"editedAt": "2022-07-18T17:19:05Z"
},
{
"diff": "## 2022-07-18 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- References\r\n - https://medium.com/51nodes/decentralized-schema-registry-aa662b8db12b",
"editedAt": "2022-07-18T16:30:08Z"
},
{
"diff": "# 2022-07-18 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- References\r\n - https://medium.com/51nodes/decentralized-schema-registry-aa662b8db12b",
"editedAt": "2022-07-18T16:29:54Z"
}
]
}
}
]
}
},
{
"body": "# 2022-07-19 Engineering Logs\r\n\r\n- TODO\r\n - [x] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting",
"createdAt": "2022-07-19T15:38:19Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABOOkw==",
"hasNextPage": false
},
"totalCount": 2,
"nodes": [
{
"diff": "# 2022-07-19 Engineering Logs\r\n\r\n- TODO\r\n - [x] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting",
"editedAt": "2022-07-22T19:12:17Z"
},
{
"diff": "# 2022-07-19 Engineering Logs\r\n\r\n- TODO\r\n - [ ] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting",
"editedAt": "2022-07-19T15:38:19Z"
}
]
},
"replies": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpK5MjAyMi0wNy0xOVQwODo0MToxMy0wNzowMM4AMI2E",
"hasNextPage": false
},
"totalCount": 1,
"nodes": [
{
"body": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- Some good spdx DAG stuff on how we turn source into build SBOM wise\r\n - https://lists.spdx.org/g/Spdx-tech/message/4659\r\n- References\r\n - https://github.com/nsmith5/rekor-sidekick\r\n - > Rekor transparency log monitoring and alerting\r\n - Leverages Open Policy Agent\r\n - Found while looking at Open Policy Agent to see if we can serialize to JSON.\r\n - Possibly use to facilitate our downstream validation\r\n - https://github.com/intel/dffml/issues/1315\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments\r\n\r\n> \"Intel Corporation congratulates the DID Working Group on Decentralized Identifier (DID) 1.0 reaching W3C Recommendation status.\r\n> DID provides a framework to unify and consolidate multiple evolving identity systems. Consolidating identity systems within a single framework is useful for validating the authenticity of information and preserving its integrity as it is moved and processed among cloud, edge, and client systems. This potentially increases the capabilities of the Web to connect and unify multiple sources of information.\r\n\r\nThe continuing evolution of this work will be key to the development of new technologies in the fields of supply chain management and Internet of Things (IoT) devices and services. 
For example, a Birds of a Feather (BOF) discussion group at IETF [Supply Chain Integrity, Transparency, and Trust (SCITT)](https://datatracker.ietf.org/doc/bofreq-birkholz-supply-chain-integrity-transparency-and-trust-scitt/) has already highlighted DID as a useful approach in providing much needed structure for exchanging information through the supply chain, and the Web of Things (WoT) WG is planning to support DID for identifying and discovering IoT devices and metadata.\r\n\r\nIntel Corporation supports this work and encourages the DID Working Group to continue working towards the convergence of widely implemented and adopted standardized best practices for identity in its next charter.\"\r\n\r\nEric Siow, Web Standards and Ecosystem Strategies Director, Intel Corporation\r\n\r\n\r\n\r\n\r\n- https://blog.devgenius.io/top-10-architecture-characteristics-non-functional-requirements-with-cheatsheat-7ad14bbb0a9b\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/179842612-5fb02fb5-1f26-4cb4-af0d-d375b1134ace.png)\r\n\r\n- For Vol 3, on mind control\r\n - https://bigthink.com/the-present/sophists/\r\n\r\n---\r\n\r\nUnsent to Mike Scovetta: michael.scovetta (at) microsoft.com\r\n\r\nHi Mike,\r\n\r\nHope you\u2019ve been well. It\u2019s John from Intel. Thanks again to you and the team for welcoming me to the Identifying Security Threats working group meeting [2021-02-18](https://docs.google.com/document/d/1AfI0S6VjBCO0ZkULCYZGHuzzW8TPqO3zYxRjzmKvUB4/edit#heading=h.mfw2bj5svu9u) last year. We talked a bit about how Intel had a similar effort. I then changed roles hoping to get more involved with OpenSSF but then ended up getting told to be uninvolved. Now I switched roles again and involvement is in scope! Sorry for the lapse in communications.\r\n\r\nI periodically check the minutes so I joined today and asked about the \"Alpha-Omega\" project from last week\u2019s minutes which I then did some research on. We just started what looks to me to be an aligned project, coincidentally named Alice Omega Alpha: https://github.com/intel/dffml/tree/alice/entities/alice\r\n\r\nIt looks to me like Alice's mission to proactively enable developers and organizations to deliver organizationally context aware, adaptive secure by default best practices to teams aligns with project Alpha-Omega\u2019s goals. Alice is a part of Intel's software supply chain security strategy and her initial focus is on securing Intel\u2019s software supply chain and facilitating adoption of InnerSource practices.\r\n\r\nAlice is the nickname for both the entity and the architecture, the Open Architecture, which is a methodology for interpretation of existing well established, formats, protocols, and other domain specific representations of architecture. What we end up with is some JSON, YAML, or other blob of structured data that we can use to build cross language tooling focused more on policy and intent, incorporating data from arbitrary sources to create a holistic picture of software across dependency boundaries by focusing on threat models.\r\n\r\nAlice will be doing scans of open source projects and we\u2019d still love to collaborate to contribute metrics to the OpenSSF metrics database, we can easily have her shoot applicable metrics off to that DB. We\u2019ve also been looking at fusing VEX and DIDs to facilitate distributed vulnerability disclosure and patch distribution.\r\n\r\n---\r\n\r\nUnset to Jun Takei: jun.takei (at) intel.com\r\n\r\nThe W3C today issued the recommendation on DIDs. 
Jun I saw from Eric's\r\ncomment on the press release that the SCITT working group has an SCITT\r\nArchitecture which DID's might be suitable for.\r\n\r\nThe DFFML community is working on a project called Alice\r\nhttps://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\nshe is intended to be a developer helper. She's also the way we data mine\r\nsource repositories (etc.).\r\n\r\nShe\u2019s open source with a plugin system (\"overlays\") so we can write open source code\r\nand then just add our internal integrations. This system relies on an abstraction of\r\narchitecture known as the Open Architecture. The Open Architecture, also known as\r\nAlice, is a methodology for interpreting directed graphs of domain specific architectures.\r\nAlice is the name we give both the entity and the architecture. We are hoping to\r\nhave Alice store and process information backed by directed graphs of DIDs, SBOMs, and\r\nVEX info primarily. This sounds very similar to the SCITT Architecture. We would love to\r\ncollaborate with you both to help make SCITT a success. Alice is focused on analysis of\r\nour software supply chain so as to ensure we conform to best practices. We would like\r\nthe analysis to serialize directly to an industry best practice format for that as well,\r\nwhich SCITT looks to be.\r\n\r\nTo increase the level of trust in our supply chain we would like to ensure interoperability\r\nup and down the stack. Ned is involved in the DICE space and communicated to me\r\nthat \r\n\r\nPlease let us know where things are at with your involvement with DIDs and SCITT so we\r\ncan be in sync with Intel's involvement and direction in this space. Please also let us know\r\nhow we could best establish an ongoing line of communication so as to build off and\r\ncontribute to where possible the work you're involved in.\r\n\r\nReferences:\r\n- https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n- https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n- https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun and Mike and Dan lorenc.d (at) gmail.com\r\n\r\nI commented on the OpenSFF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information.\r\n\r\nWe've been looking potentially at a hybrid DID plus rekor\r\narchitecture (DIDs eventually as a proxy to) \r\n\r\nReferences:\r\n- https://github.com/sigstore/rekor\r\n",
"createdAt": "2022-07-19T15:41:13Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABNjvA==",
"hasNextPage": false
},
"totalCount": 12,
"nodes": [
{
"diff": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- Some good spdx DAG stuff on how we turn source into build SBOM wise\r\n - https://lists.spdx.org/g/Spdx-tech/message/4659\r\n- References\r\n - https://github.com/nsmith5/rekor-sidekick\r\n - > Rekor transparency log monitoring and alerting\r\n - Leverages Open Policy Agent\r\n - Found while looking at Open Policy Agent to see if we can serialize to JSON.\r\n - Possibly use to facilitate our downstream validation\r\n - https://github.com/intel/dffml/issues/1315\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments\r\n\r\n> \"Intel Corporation congratulates the DID Working Group on Decentralized Identifier (DID) 1.0 reaching W3C Recommendation status.\r\n> DID provides a framework to unify and consolidate multiple evolving identity systems. Consolidating identity systems within a single framework is useful for validating the authenticity of information and preserving its integrity as it is moved and processed among cloud, edge, and client systems. This potentially increases the capabilities of the Web to connect and unify multiple sources of information.\r\n\r\nThe continuing evolution of this work will be key to the development of new technologies in the fields of supply chain management and Internet of Things (IoT) devices and services. 
For example, a Birds of a Feather (BOF) discussion group at IETF [Supply Chain Integrity, Transparency, and Trust (SCITT)](https://datatracker.ietf.org/doc/bofreq-birkholz-supply-chain-integrity-transparency-and-trust-scitt/) has already highlighted DID as a useful approach in providing much needed structure for exchanging information through the supply chain, and the Web of Things (WoT) WG is planning to support DID for identifying and discovering IoT devices and metadata.\r\n\r\nIntel Corporation supports this work and encourages the DID Working Group to continue working towards the convergence of widely implemented and adopted standardized best practices for identity in its next charter.\"\r\n\r\nEric Siow, Web Standards and Ecosystem Strategies Director, Intel Corporation\r\n\r\n\r\n\r\n\r\n- https://blog.devgenius.io/top-10-architecture-characteristics-non-functional-requirements-with-cheatsheat-7ad14bbb0a9b\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/179842612-5fb02fb5-1f26-4cb4-af0d-d375b1134ace.png)\r\n\r\n- For Vol 3, on mind control\r\n - https://bigthink.com/the-present/sophists/\r\n\r\n---\r\n\r\nUnsent to Mike Scovetta: michael.scovetta (at) microsoft.com\r\n\r\nHi Mike,\r\n\r\nHope you\u2019ve been well. It\u2019s John from Intel. Thanks again to you and the team for welcoming me to the Identifying Security Threats working group meeting [2021-02-18](https://docs.google.com/document/d/1AfI0S6VjBCO0ZkULCYZGHuzzW8TPqO3zYxRjzmKvUB4/edit#heading=h.mfw2bj5svu9u) last year. We talked a bit about how Intel had a similar effort. I then changed roles hoping to get more involved with OpenSSF but then ended up getting told to be uninvolved. Now I switched roles again and involvement is in scope! Sorry for the lapse in communications.\r\n\r\nI periodically check the minutes so I joined today and asked about the \"Alpha-Omega\" project from last week\u2019s minutes which I then did some research on. We just started what looks to me to be an aligned project, coincidentally named Alice Omega Alpha: https://github.com/intel/dffml/tree/alice/entities/alice\r\n\r\nIt looks to me like Alice's mission to proactively enable developers and organizations to deliver organizationally context aware, adaptive secure by default best practices to teams aligns with project Alpha-Omega\u2019s goals. Alice is a part of Intel's software supply chain security strategy and her initial focus is on securing Intel\u2019s software supply chain and facilitating adoption of InnerSource practices.\r\n\r\nAlice is the nickname for both the entity and the architecture, the Open Architecture, which is a methodology for interpretation of existing well established, formats, protocols, and other domain specific representations of architecture. What we end up with is some JSON, YAML, or other blob of structured data that we can use to build cross language tooling focused more on policy and intent, incorporating data from arbitrary sources to create a holistic picture of software across dependency boundaries by focusing on threat models.\r\n\r\nAlice will be doing scans of open source projects and we\u2019d still love to collaborate to contribute metrics to the OpenSSF metrics database, we can easily have her shoot applicable metrics off to that DB. We\u2019ve also been looking at fusing VEX and DIDs to facilitate distributed vulnerability disclosure and patch distribution.\r\n\r\n---\r\n\r\nUnset to Jun Takei: jun.takei (at) intel.com\r\n\r\nThe W3C today issued the recommendation on DIDs. 
Jun I saw from Eric's\r\ncomment on the press release that the SCITT working group has an SCITT\r\nArchitecture which DID's might be suitable for.\r\n\r\nThe DFFML community is working on a project called Alice\r\nhttps://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\nshe is intended to be a developer helper. She's also the way we data mine\r\nsource repositories (etc.).\r\n\r\nShe\u2019s open source with a plugin system (\"overlays\") so we can write open source code\r\nand then just add our internal integrations. This system relies on an abstraction of\r\narchitecture known as the Open Architecture. The Open Architecture, also known as\r\nAlice, is a methodology for interpreting directed graphs of domain specific architectures.\r\nAlice is the name we give both the entity and the architecture. We are hoping to\r\nhave Alice store and process information backed by directed graphs of DIDs, SBOMs, and\r\nVEX info primarily. This sounds very similar to the SCITT Architecture. We would love to\r\ncollaborate with you both to help make SCITT a success. Alice is focused on analysis of\r\nour software supply chain so as to ensure we conform to best practices. We would like\r\nthe analysis to serialize directly to an industry best practice format for that as well,\r\nwhich SCITT looks to be.\r\n\r\nTo increase the level of trust in our supply chain we would like to ensure interoperability\r\nup and down the stack. Ned is involved in the DICE space and communicated to me\r\nthat \r\n\r\nPlease let us know where things are at with your involvement with DIDs and SCITT so we\r\ncan be in sync with Intel's involvement and direction in this space. Please also let us know\r\nhow we could best establish an ongoing line of communication so as to build off and\r\ncontribute to where possible the work you're involved in.\r\n\r\nReferences:\r\n- https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n- https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n- https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun and Mike and Dan lorenc.d (at) gmail.com\r\n\r\nI commented on the OpenSFF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information.\r\n\r\nWe've been looking potentially at a hybrid DID plus rekor\r\narchitecture (DIDs eventually as a proxy to) \r\n\r\nReferences:\r\n- https://github.com/sigstore/rekor\r\n",
"editedAt": "2022-07-22T19:12:19Z"
},
{
"diff": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- Some good spdx DAG stuff on how we turn source into build SBOM wise\r\n - https://lists.spdx.org/g/Spdx-tech/message/4659\r\n- References\r\n - https://github.com/nsmith5/rekor-sidekick\r\n - > Rekor transparency log monitoring and alerting\r\n - Leverages Open Policy Agent\r\n - Found while looking at Open Policy Agent to see if we can serialize to JSON.\r\n - Possibly use to facilitate our downstream validation\r\n - https://github.com/intel/dffml/issues/1315\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments\r\n\r\n> \"Intel Corporation congratulates the DID Working Group on Decentralized Identifier (DID) 1.0 reaching W3C Recommendation status.\r\n> DID provides a framework to unify and consolidate multiple evolving identity systems. Consolidating identity systems within a single framework is useful for validating the authenticity of information and preserving its integrity as it is moved and processed among cloud, edge, and client systems. This potentially increases the capabilities of the Web to connect and unify multiple sources of information.\r\n\r\nThe continuing evolution of this work will be key to the development of new technologies in the fields of supply chain management and Internet of Things (IoT) devices and services. 
For example, a Birds of a Feather (BOF) discussion group at IETF [Supply Chain Integrity, Transparency, and Trust (SCITT)](https://datatracker.ietf.org/doc/bofreq-birkholz-supply-chain-integrity-transparency-and-trust-scitt/) has already highlighted DID as a useful approach in providing much needed structure for exchanging information through the supply chain, and the Web of Things (WoT) WG is planning to support DID for identifying and discovering IoT devices and metadata.\r\n\r\nIntel Corporation supports this work and encourages the DID Working Group to continue working towards the convergence of widely implemented and adopted standardized best practices for identity in its next charter.\"\r\n\r\nEric Siow, Web Standards and Ecosystem Strategies Director, Intel Corporation\r\n\r\n\r\n\r\n\r\n- https://blog.devgenius.io/top-10-architecture-characteristics-non-functional-requirements-with-cheatsheat-7ad14bbb0a9b\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/179842612-5fb02fb5-1f26-4cb4-af0d-d375b1134ace.png)\r\n\r\n- For Vol 3, on mind control\r\n - https://bigthink.com/the-present/sophists/\r\n\r\n---\r\n\r\nUnsent to Mike Scovetta: michael.scovetta (at) microsoft.com\r\n\r\nHi Mike,\r\n\r\nHope you\u2019ve been well. It\u2019s John from Intel. Thanks again to you and the team for welcoming me to the Identifying Security Threats working group meeting [2021-02-18](https://docs.google.com/document/d/1AfI0S6VjBCO0ZkULCYZGHuzzW8TPqO3zYxRjzmKvUB4/edit#heading=h.mfw2bj5svu9u) last year. We talked a bit about how Intel had a similar effort. I then changed roles hoping to get more involved with OpenSSF but then ended up getting told to be uninvolved. Now I switched roles again and involvement is in scope! Sorry for the lapse in communications.\r\n\r\nI periodically check the minutes so I joined today and asked about the \"Alpha-Omega\" project from last week\u2019s minutes which I then did some research on. We just started what looks to me to be an aligned project, coincidentally named Alice Omega Alpha: https://github.com/intel/dffml/tree/alice/entities/alice\r\n\r\nIt looks to me like Alice's mission to proactively enable developers and organizations to deliver organizationally context aware, adaptive secure by default best practices to teams aligns with project Alpha-Omega\u2019s goals. Alice is a part of Intel's software supply chain security strategy and her initial focus is on securing Intel\u2019s software supply chain and facilitating adoption of InnerSource practices.\r\n\r\nAlice is the nickname for both the entity and the architecture, the Open Architecture, which is a methodology for interpretation of existing well established, formats, protocols, and other domain specific representations of architecture. What we end up with is some JSON, YAML, or other blob of structured data that we can use to build cross language tooling focused more on policy and intent, incorporating data from arbitrary sources to create a holistic picture of software across dependency boundaries by focusing on threat models.\r\n\r\nAlice will be doing scans of open source projects and we\u2019d still love to collaborate to contribute metrics to the OpenSSF metrics database, we can easily have her shoot applicable metrics off to that DB. We\u2019ve also been looking at fusing VEX and DIDs to facilitate distributed vulnerability disclosure and patch distribution.\r\n\r\n---\r\n\r\nUnset to Jun Takei: jun.takei (at) intel.com\r\n\r\nThe W3C today issued the recommendation on DIDs. 
Jun I saw from Eric's\r\ncomment on the press release that the SCITT working group has an SCITT\r\nArchitecture which DID's might be suitable for.\r\n\r\nThe DFFML community is working on a project called Alice\r\nhttps://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\nshe is intended to be a developer helper. She's also the way we data mine\r\nsource repositories (etc.).\r\n\r\nShe\u2019s open source with a plugin system (\"overlays\") so we can write open source code\r\nand then just add our internal integrations. This system relies on an abstraction of\r\narchitecture known as the Open Architecture. The Open Architecture, also known as\r\nAlice, is a methodology for interpreting directed graphs of domain specific architectures.\r\nAlice is the name we give both the entity and the architecture. We are hoping to\r\nhave Alice store and process information backed by directed graphs of DIDs, SBOMs, and\r\nVEX info primarily. This sounds very similar to the SCITT Architecture. We would love to\r\ncollaborate with you both to help make SCITT a success. Alice is focused on analysis of\r\nour software supply chain so as to ensure we conform to best practices. We would like\r\nthe analysis to serialize directly to an industry best practice format for that as well,\r\nwhich SCITT looks to be.\r\n\r\nTo increase the level of trust in our supply chain we would like to ensure interoperability\r\nup and down the stack. Ned is involved in the DICE space and communicated to me\r\nthat \r\n\r\nPlease let us know where things are at with your involvement with DIDs and SCITT so we\r\ncan be in sync with Intel's involvement and direction in this space. Please also let us know\r\nhow we could best establish an ongoing line of communication so as to build off and\r\ncontribute to where possible the work you're involved in.\r\n\r\nReferences:\r\n- https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n- https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n- https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun and Mike and Dan lorenc.d (at) gmail.com\r\n\r\nI commented on the OpenSFF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information.\r\n\r\nWe've been looking potentially at a hybrid DID plus rekor\r\narchitecture (DIDs eventually as a proxy to) \r\n\r\nReferences:\r\n- https://github.com/sigstore/rekor\r\n",
"editedAt": "2022-07-20T03:22:57Z"
},
{
"diff": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- Some good spdx DAG stuff on how we turn source into build SBOM wise\r\n - https://lists.spdx.org/g/Spdx-tech/message/4659\r\n- References\r\n - https://github.com/nsmith5/rekor-sidekick\r\n - > Rekor transparency log monitoring and alerting\r\n - Leverages Open Policy Agent\r\n - Found while looking at Open Policy Agent to see if we can serialize to JSON.\r\n - Possibly use to facilitate our downstream validation\r\n - https://github.com/intel/dffml/issues/1315\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments\r\n\r\n> \"Intel Corporation congratulates the DID Working Group on Decentralized Identifier (DID) 1.0 reaching W3C Recommendation status.\r\n> DID provides a framework to unify and consolidate multiple evolving identity systems. Consolidating identity systems within a single framework is useful for validating the authenticity of information and preserving its integrity as it is moved and processed among cloud, edge, and client systems. This potentially increases the capabilities of the Web to connect and unify multiple sources of information.\r\n\r\nThe continuing evolution of this work will be key to the development of new technologies in the fields of supply chain management and Internet of Things (IoT) devices and services. 
For example, a Birds of a Feather (BOF) discussion group at IETF [Supply Chain Integrity, Transparency, and Trust (SCITT)](https://datatracker.ietf.org/doc/bofreq-birkholz-supply-chain-integrity-transparency-and-trust-scitt/) has already highlighted DID as a useful approach in providing much needed structure for exchanging information through the supply chain, and the Web of Things (WoT) WG is planning to support DID for identifying and discovering IoT devices and metadata.\r\n\r\nIntel Corporation supports this work and encourages the DID Working Group to continue working towards the convergence of widely implemented and adopted standardized best practices for identity in its next charter.\"\r\n\r\nEric Siow, Web Standards and Ecosystem Strategies Director, Intel Corporation\r\n\r\n\r\n\r\n\r\n- https://blog.devgenius.io/top-10-architecture-characteristics-non-functional-requirements-with-cheatsheat-7ad14bbb0a9b\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/179842612-5fb02fb5-1f26-4cb4-af0d-d375b1134ace.png)\r\n\r\n- For Vol 3, on mind control\r\n - https://bigthink.com/the-present/sophists/\r\n\r\n---\r\n\r\nUnsent to Mike Scovetta: michael.scovetta (at) microsoft.com\r\n\r\nHi Mike,\r\n\r\nHope you\u2019ve been well. It\u2019s John from Intel. Thanks again to you and the team for welcoming me to the Identifying Security Threats working group meeting [2021-02-18](https://docs.google.com/document/d/1AfI0S6VjBCO0ZkULCYZGHuzzW8TPqO3zYxRjzmKvUB4/edit#heading=h.mfw2bj5svu9u) last year. We talked a bit about how Intel had a similar effort. I then changed roles hoping to get more involved with OpenSSF but then ended up getting told to be uninvolved. Now I switched roles again and involvement is in scope! Sorry for the lapse in communications.\r\n\r\nI periodically check the minutes so I joined today and asked about the \"Alpha-Omega\" project from last week\u2019s minutes which I then did some research on. We just started what looks to me to be an aligned project, coincidentally named Alice Omega Alpha: https://github.com/intel/dffml/tree/alice/entities/alice\r\n\r\nIt looks to me like Alice's mission to proactively enable developers and organizations to deliver organizationally context aware, adaptive secure by default best practices to teams aligns with project Alpha-Omega\u2019s goals. Alice is a part of Intel's software supply chain security strategy and her initial focus is on securing Intel\u2019s software supply chain and facilitating adoption of InnerSource practices.\r\n\r\nAlice is the nickname for both the entity and the architecture, the Open Architecture, which is a methodology for interpretation of existing well established, formats, protocols, and other domain specific representations of architecture. What we end up with is some JSON, YAML, or other blob of structured data that we can use to build cross language tooling focused more on policy and intent, incorporating data from arbitrary sources to create a holistic picture of software across dependency boundaries by focusing on threat models.\r\n\r\nAlice will be doing scans of open source projects and we\u2019d still love to collaborate to contribute metrics to the OpenSSF metrics database, we can easily have her shoot applicable metrics off to that DB. We\u2019ve also been looking at fusing VEX and DIDs to facilitate distributed vulnerability disclosure and patch distribution.\r\n\r\n---\r\n\r\nUnset to Jun Takei: jun.takei (at) intel.com\r\n\r\nThe W3C today issued the recommendation on DIDs. 
Jun I saw from Eric's\r\ncomment on the press release that the SCITT working group has an SCITT\r\nArchitecture which DID's might be suitable for.\r\n\r\nThe DFFML community is working on a project called Alice\r\nhttps://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\nshe is intended to be a developer helper. She's also the way we data mine\r\nsource repositories (etc.).\r\n\r\nShe\u2019s open source with a plugin system (\"overlays\") so we can write open source code\r\nand then just add our internal integrations. This system relies on an abstraction of\r\narchitecture known as the Open Architecture. The Open Architecture, also known as\r\nAlice, is a methodology for interpreting directed graphs of domain specific architectures.\r\nAlice is the name we give both the entity and the architecture. We are hoping to\r\nhave Alice store and process information backed by directed graphs of DIDs, SBOMs, and\r\nVEX info primarily. This sounds very similar to the SCITT Architecture. We would love to\r\ncollaborate with you both to help make SCITT a success. Alice is focused on analysis of\r\nour software supply chain so as to ensure we conform to best practices. We would like\r\nthe analysis to serialize directly to an industry best practice format for that as well,\r\nwhich SCITT looks to be.\r\n\r\nTo increase the level of trust in our supply chain we would like to ensure interoperability\r\nup and down the stack. Ned is involved in the DICE space and communicated to me\r\nthat \r\n\r\nPlease let us know where things are at with your involvement with DIDs and SCITT so we\r\ncan be in sync with Intel's involvement and direction in this space. Please also let us know\r\nhow we could best establish an ongoing line of communication so as to build off and\r\ncontribute to where possible the work you're involved in.\r\n\r\nReferences:\r\n- https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n- https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n- https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun Takei and Mike\r\n\r\nI commented on the OpenSFF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information. \r\n\r\n",
"editedAt": "2022-07-20T03:19:41Z"
},
{
"diff": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- Some good spdx DAG stuff on how we turn source into build SBOM wise\r\n - https://lists.spdx.org/g/Spdx-tech/message/4659\r\n- References\r\n - https://github.com/nsmith5/rekor-sidekick\r\n - > Rekor transparency log monitoring and alerting\r\n - Leverages Open Policy Agent\r\n - Found while looking at Open Policy Agent to see if we can serialize to JSON.\r\n - Possibly use to facilitate our downstream validation\r\n - https://github.com/intel/dffml/issues/1315\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments\r\n\r\n> \"Intel Corporation congratulates the DID Working Group on Decentralized Identifier (DID) 1.0 reaching W3C Recommendation status.\r\n> DID provides a framework to unify and consolidate multiple evolving identity systems. Consolidating identity systems within a single framework is useful for validating the authenticity of information and preserving its integrity as it is moved and processed among cloud, edge, and client systems. This potentially increases the capabilities of the Web to connect and unify multiple sources of information.\r\n\r\nThe continuing evolution of this work will be key to the development of new technologies in the fields of supply chain management and Internet of Things (IoT) devices and services. 
For example, a Birds of a Feather (BOF) discussion group at IETF [Supply Chain Integrity, Transparency, and Trust (SCITT)](https://datatracker.ietf.org/doc/bofreq-birkholz-supply-chain-integrity-transparency-and-trust-scitt/) has already highlighted DID as a useful approach in providing much needed structure for exchanging information through the supply chain, and the Web of Things (WoT) WG is planning to support DID for identifying and discovering IoT devices and metadata.\r\n\r\nIntel Corporation supports this work and encourages the DID Working Group to continue working towards the convergence of widely implemented and adopted standardized best practices for identity in its next charter.\"\r\n\r\nEric Siow, Web Standards and Ecosystem Strategies Director, Intel Corporation\r\n\r\n\r\n\r\n\r\n- https://blog.devgenius.io/top-10-architecture-characteristics-non-functional-requirements-with-cheatsheat-7ad14bbb0a9b\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/179842612-5fb02fb5-1f26-4cb4-af0d-d375b1134ace.png)\r\n\r\n- For Vol 3, on mind control\r\n - https://bigthink.com/the-present/sophists/\r\n\r\n---\r\n\r\nUnsent to Mike Scovetta: michael.scovetta (at) microsoft.com\r\n\r\nHi Mike,\r\n\r\nHope you\u2019ve been well. It\u2019s John from Intel. Thanks again to you and the team for welcoming me to the Identifying Security Threats working group meeting [2021-02-18](https://docs.google.com/document/d/1AfI0S6VjBCO0ZkULCYZGHuzzW8TPqO3zYxRjzmKvUB4/edit#heading=h.mfw2bj5svu9u) last year. We talked a bit about how Intel had a similar effort. I then changed roles hoping to get more involved with OpenSSF but then ended up getting told to be uninvolved. Now I switched roles again and involvement is in scope! Sorry for the lapse in communications.\r\n\r\nI periodically check the minutes so I joined today and asked about the \"Alpha-Omega\" project from last week\u2019s minutes which I then did some research on. We just started what looks to me to be an aligned project, coincidentally named Alice Omega Alpha: https://github.com/intel/dffml/tree/alice/entities/alice\r\n\r\nIt looks to me like Alice's mission to proactively enable developers and organizations to deliver organizationally context aware, adaptive secure by default best practices to teams aligns with project Alpha-Omega\u2019s goals. Alice is a part of Intel's software supply chain security strategy and her initial focus is on securing Intel\u2019s software supply chain and facilitating adoption of InnerSource practices.\r\n\r\nAlice is the nickname for both the entity and the architecture, the Open Architecture, which is a methodology for interpretation of existing well established, formats, protocols, and other domain specific representations of architecture. What we end up with is some JSON, YAML, or other blob of structured data that we can use to build cross language tooling focused more on policy and intent, incorporating data from arbitrary sources to create a holistic picture of software across dependency boundaries by focusing on threat models.\r\n\r\nAlice will be doing scans of open source projects and we\u2019d still love to collaborate to contribute metrics to the OpenSSF metrics database, we can easily have her shoot applicable metrics off to that DB. We\u2019ve also been looking at fusing VEX and DIDs to facilitate distributed vulnerability disclosure and patch distribution.\r\n\r\n---\r\n\r\nTo: Jun Takei: jun.takei (at) intel.com\r\n\r\nThe W3C today issued the recommendation on DIDs. 
Jun I saw from Eric's\r\ncomment on the press release that the SCITT working group has a SCITT\r\nArchitecture which DIDs might be suitable for.\r\n\r\nMike, I commented on the OpenSSF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information. \r\n\r\n\r\nThe DFFML community is working on a project called Alice:\r\nhttps://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\nShe is intended to be a developer helper. She's also the way we data mine\r\nsource repositories (etc.).\r\n\r\nShe\u2019s open source with a plugin system (\"overlays\") so we can write open source code\r\nand then just add our internal integrations. This system relies on an abstraction of\r\narchitecture known as the Open Architecture. The Open Architecture, also known as\r\nAlice, is a methodology for interpreting directed graphs of domain specific architectures.\r\nAlice is the name we give both the entity and the architecture. We are hoping to\r\nhave Alice store and process information backed by directed graphs of DIDs, SBOMs, and\r\nVEX info primarily. This sounds very similar to the SCITT Architecture. We would love to\r\ncollaborate with you both to help make SCITT a success. Alice is focused on analysis of\r\nour software supply chain so as to ensure we conform to best practices. We would like\r\nthe analysis to serialize directly to an industry best practice format for that as well,\r\nwhich SCITT looks to be.\r\n\r\nTo increase the level of trust in our supply chain we would like to ensure interoperability\r\nup and down the stack. Ned is involved in the DICE space and communicated to me\r\nthat \r\n\r\nPlease let us know where things are at with your involvement with DIDs and SCITT so we\r\ncan be in sync with Intel's involvement and direction in this space. Please also let us know\r\nhow we could best establish an ongoing line of communication so we can build on and,\r\nwhere possible, contribute to the work you're involved in.\r\n\r\nReferences:\r\n- https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n- https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n- https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun Takei and Mike\r\n",
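As a toy illustration of the "directed graphs of DIDs, SBOMs, and VEX info" idea in the note above (a sketch only, not the Open Architecture's actual serialization; every identifier below is made up), a graph like this could be one of the JSON blobs such tooling consumes:

```python
# Toy sketch only: a directed graph linking a DID to SBOM and VEX nodes.
# Every identifier below is made up for illustration.
import json

graph = {
    "nodes": {
        "did:example:alice-project": {"type": "did"},
        "sbom:example:alice-project-1.0.spdx.json": {"type": "sbom"},
        "vex:example:statement-0001": {"type": "vex"},
    },
    "edges": [
        # Direction reads "source --relation--> target".
        {
            "source": "did:example:alice-project",
            "relation": "describedBy",
            "target": "sbom:example:alice-project-1.0.spdx.json",
        },
        {
            "source": "sbom:example:alice-project-1.0.spdx.json",
            "relation": "hasVulnerabilityStatement",
            "target": "vex:example:statement-0001",
        },
    ],
}

if __name__ == "__main__":
    # One possible "blob of structured data" a downstream tool could consume.
    print(json.dumps(graph, indent=2))
```

The point is only the shape: nodes for identity, inventory, and vulnerability statements, with typed directed edges between them.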
"editedAt": "2022-07-20T03:19:01Z"
},
{
"diff": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- Some good spdx DAG stuff on how we turn source into build SBOM wise\r\n - https://lists.spdx.org/g/Spdx-tech/message/4659\r\n- References\r\n - https://github.com/nsmith5/rekor-sidekick\r\n - > Rekor transparency log monitoring and alerting\r\n - Leverages Open Policy Agent\r\n - Found while looking at Open Policy Agent to see if we can serialize to JSON.\r\n - Possibly use to facilitate our downstream validation\r\n - https://github.com/intel/dffml/issues/1315\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments\r\n\r\n> \"Intel Corporation congratulates the DID Working Group on Decentralized Identifier (DID) 1.0 reaching W3C Recommendation status.\r\n> DID provides a framework to unify and consolidate multiple evolving identity systems. Consolidating identity systems within a single framework is useful for validating the authenticity of information and preserving its integrity as it is moved and processed among cloud, edge, and client systems. This potentially increases the capabilities of the Web to connect and unify multiple sources of information.\r\n\r\nThe continuing evolution of this work will be key to the development of new technologies in the fields of supply chain management and Internet of Things (IoT) devices and services. 
For example, a Birds of a Feather (BOF) discussion group at IETF [Supply Chain Integrity, Transparency, and Trust (SCITT)](https://datatracker.ietf.org/doc/bofreq-birkholz-supply-chain-integrity-transparency-and-trust-scitt/) has already highlighted DID as a useful approach in providing much needed structure for exchanging information through the supply chain, and the Web of Things (WoT) WG is planning to support DID for identifying and discovering IoT devices and metadata.\r\n\r\nIntel Corporation supports this work and encourages the DID Working Group to continue working towards the convergence of widely implemented and adopted standardized best practices for identity in its next charter.\"\r\n\r\nEric Siow, Web Standards and Ecosystem Strategies Director, Intel Corporation\r\n\r\n\r\n\r\n\r\n- https://blog.devgenius.io/top-10-architecture-characteristics-non-functional-requirements-with-cheatsheat-7ad14bbb0a9b\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/179842612-5fb02fb5-1f26-4cb4-af0d-d375b1134ace.png)\r\n\r\n- For Vol 3, on mind control\r\n - https://bigthink.com/the-present/sophists/\r\n\r\n---\r\n\r\nUnsent to Mike: michael.scovetta@microsoft.com\r\n\r\nHi Mike,\r\n\r\nHope you\u2019ve been well. It\u2019s John from Intel. Thanks again to you and the team for welcoming me to the Identifying Security Threats working group meeting [2021-02-18](https://docs.google.com/document/d/1AfI0S6VjBCO0ZkULCYZGHuzzW8TPqO3zYxRjzmKvUB4/edit#heading=h.mfw2bj5svu9u) last year. We talked a bit about how Intel had a similar effort. I then changed roles hoping to get more involved with OpenSSF but then ended up getting told to be uninvolved. Now I switched roles again and involvement is in scope! Sorry for the lapse in communications.\r\n\r\nI periodically check the minutes so I joined today and asked about the \"Alpha-Omega\" project from last week\u2019s minutes which I then did some research on. We just started what looks to me to be an aligned project, coincidentally named Alice Omega Alpha: https://github.com/intel/dffml/tree/alice/entities/alice\r\n\r\nIt looks to me like Alice's mission to proactively enable developers and organizations to deliver organizationally context aware, adaptive secure by default best practices to teams aligns with project Alpha-Omega\u2019s goals. Alice is a part of Intel's software supply chain security strategy and her initial focus is on securing Intel\u2019s software supply chain and facilitating adoption of InnerSource practices.\r\n\r\nAlice is the nickname for both the entity and the architecture, the Open Architecture, which is a methodology for interpretation of existing well established, formats, protocols, and other domain specific representations of architecture. What we end up with is some JSON, YAML, or other blob of structured data that we can use to build cross language tooling focused more on policy and intent, incorporating data from arbitrary sources to create a holistic picture of software across dependency boundaries by focusing on threat models.\r\n\r\nAlice will be doing scans of open source projects and we\u2019d still love to collaborate to contribute metrics to the OpenSSF metrics database, we can easily have her shoot applicable metrics off to that DB. We\u2019ve also been looking at fusing VEX and DIDs to facilitate distributed vulnerability disclosure and patch distribution.\r\n\r\n---\r\n\r\nTo: Jun Takei and Microsoft Mike\r\n\r\nThe W3C today issued the recommendation on DIDs. 
Jun I saw from Eric's\r\ncomment on the press release that the SCITT working group has an SCITT\r\nArchitecture which DID's might be suitable for.\r\n\r\nMike we've\r\nI commented on the OpenSFF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information. \r\n\r\n\r\nThe DFFML community is working on a project called Alice\r\nhttps://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\nshe is intended to be a developer helper. She's also the way we data mine\r\nsource repositories (etc.).\r\n\r\nShe\u2019s open source with a plugin system (\"overlays\") so we can write open source code\r\nand then just add our internal integrations. This system relies on an abstraction of\r\narchitecture known as the Open Architecture. The Open Architecture, also known as\r\nAlice, is a methodology for interpreting directed graphs of domain specific architectures.\r\nAlice is the name we give both the entity and the architecture. We are hoping to\r\nhave Alice store and process information backed by directed graphs of DIDs, SBOMs, and\r\nVEX info primarily. This sounds very similar to the SCITT Architecture. We would love to\r\ncollaborate with you both to help make SCITT a success. Alice is focused on analysis of\r\nour software supply chain so as to ensure we conform to best practices. We would like\r\nthe analysis to serialize directly to an industry best practice format for that as well,\r\nwhich SCITT looks to be.\r\n\r\nTo increase the level of trust in our supply chain we would like to ensure interoperability\r\nup and down the stack. Ned is involved in the DICE space and communicated to me\r\nthat \r\n\r\nPlease let us know where things are at with your involvement with DIDs and SCITT so we\r\ncan be in sync with Intel's involvement and direction in this space. Please also let us know\r\nhow we could best establish an ongoing line of communication so as to build off and\r\ncontribute to where possible the work you're involved in.\r\n\r\nReferences:\r\n- https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n- https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n- https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture",
"editedAt": "2022-07-20T03:17:14Z"
},
{
"diff": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- Some good spdx DAG stuff on how we turn source into build SBOM wise\r\n - https://lists.spdx.org/g/Spdx-tech/message/4659\r\n- References\r\n - https://github.com/nsmith5/rekor-sidekick\r\n - > Rekor transparency log monitoring and alerting\r\n - Leverages Open Policy Agent\r\n - Found while looking at Open Policy Agent to see if we can serialize to JSON.\r\n - Possibly use to facilitate our downstream validation\r\n - https://github.com/intel/dffml/issues/1315\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments\r\n- https://blog.devgenius.io/top-10-architecture-characteristics-non-functional-requirements-with-cheatsheat-7ad14bbb0a9b\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/179842612-5fb02fb5-1f26-4cb4-af0d-d375b1134ace.png)\r\n\r\n- For Vol 3, on mind control\r\n - https://bigthink.com/the-present/sophists/",
"editedAt": "2022-07-20T01:44:35Z"
},
{
"diff": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- Some good spdx DAG stuff on how we turn source into build SBOM wise\r\n - https://lists.spdx.org/g/Spdx-tech/message/4659\r\n- References\r\n - https://github.com/nsmith5/rekor-sidekick\r\n - > Rekor transparency log monitoring and alerting\r\n - Leverages Open Policy Agent\r\n - Found while looking at Open Policy Agent to see if we can serialize to JSON.\r\n - Possibly use to facilitate our downstream validation\r\n - https://github.com/intel/dffml/issues/1315\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments\r\n- https://blog.devgenius.io/top-10-architecture-characteristics-non-functional-requirements-with-cheatsheat-7ad14bbb0a9b\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/179842612-5fb02fb5-1f26-4cb4-af0d-d375b1134ace.png)\r\n",
"editedAt": "2022-07-19T22:58:06Z"
},
{
"diff": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- Some good spdx DAG stuff on how we turn source into build SBOM wise\r\n - https://lists.spdx.org/g/Spdx-tech/message/4659\r\n- References\r\n - https://github.com/nsmith5/rekor-sidekick\r\n - > Rekor transparency log monitoring and alerting\r\n - Leverages Open Policy Agent\r\n - Possibly use to facilitate our downstream validation\r\n - https://github.com/intel/dffml/issues/1315\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments\r\n- https://blog.devgenius.io/top-10-architecture-characteristics-non-functional-requirements-with-cheatsheat-7ad14bbb0a9b\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/179842612-5fb02fb5-1f26-4cb4-af0d-d375b1134ace.png)\r\n",
"editedAt": "2022-07-19T22:57:43Z"
},
{
"diff": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- Some good spdx DAG stuff on how we turn source into build SBOM wise\r\n - https://lists.spdx.org/g/Spdx-tech/message/4659\r\n- References\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments\r\n- https://blog.devgenius.io/top-10-architecture-characteristics-non-functional-requirements-with-cheatsheat-7ad14bbb0a9b\r\n\r\n![image](https://user-images.githubusercontent.com/5950433/179842612-5fb02fb5-1f26-4cb4-af0d-d375b1134ace.png)\r\n",
"editedAt": "2022-07-19T20:30:31Z"
},
{
"diff": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- Some good spdx DAG stuff on how we turn source into build SBOM wise\r\n - https://lists.spdx.org/g/Spdx-tech/message/4659\r\n- References\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments\r\n - ",
"editedAt": "2022-07-19T20:15:06Z"
},
{
"diff": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- Some good spdx DAG stuff on how we turn source into build SBOM wise\r\n - https://lists.spdx.org/g/Spdx-tech/message/4659\r\n- References\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments\r\n - ",
"editedAt": "2022-07-19T16:39:25Z"
},
{
"diff": "## 2022-07-19 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n- References\r\n - https://mermaid-js.github.io/mermaid/#/c4c\r\n - Mermaid is working on native https://c4model.com support!\r\n - W3C approves DIDs!\r\n - https://blog.avast.com/dids-approved-w3c\r\n - https://www.w3.org/blog/news/archives/9618\r\n - https://twitter.com/w3c/status/1549368259878723585/retweets/with_comments",
"editedAt": "2022-07-19T15:41:13Z"
}
]
}
}
]
}
},
{
"body": "# 2022-07-20 Engineering Logs\r\n\r\n- TODO\r\n - [x] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] @dffml: Get involved in SCITT\r\n - [ ] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.",
"createdAt": "2022-07-20T14:49:37Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABNwOA==",
"hasNextPage": false
},
"totalCount": 4,
"nodes": [
{
"diff": "# 2022-07-20 Engineering Logs\r\n\r\n- TODO\r\n - [x] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] @dffml: Get involved in SCITT\r\n - [ ] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.",
"editedAt": "2022-07-22T19:12:12Z"
},
{
"diff": "# 2022-07-20 Engineering Logs\r\n\r\n- TODO\r\n - [ ] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] @dffml: Get involved in SCITT\r\n - [ ] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.",
"editedAt": "2022-07-20T14:57:23Z"
},
{
"diff": "# 2022-07-20 Engineering Logs\r\n\r\n- TODO\r\n - [ ] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting",
"editedAt": "2022-07-20T14:49:43Z"
},
{
"diff": "# 2022-07-19 Engineering Logs\r\n\r\n- TODO\r\n - [ ] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting",
"editedAt": "2022-07-20T14:49:37Z"
}
]
},
"replies": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpK5MjAyMi0wNy0yMFQxMDowOTowNS0wNzowMM4AMLH8",
"hasNextPage": false
},
"totalCount": 2,
"nodes": [
{
"body": "## 2022-07-20 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [ ] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- References\r\n - https://static.sched.com/hosted_files/ossna2022/9b/presentation.pdf\r\n - > We're starting to put everything in registries, container images, signatures, SBOMs, attestations, cat pictures, we need to slow down. Our CI pipelines are designed to pass things as directories and files between stages, why aren't we doing this with our container images? OCI already defines an Image Layout Specification that defines how to structure the data on disk, and we should normalize how this is used in our tooling. This talk looks at the value of using the OCI Layout spec, what you can do today, what issues we're facing, and a call to action for more standardization between tooling in this space.\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun and Mike and Yan\r\n\r\nI commented on the OpenSFF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information.\r\n\r\nWe've been looking potentially at a hybrid DID plus rekor\r\narchitecture (DIDs eventually as a proxy to) \r\n\r\nReferences:\r\n- https://github.com/sigstore/rekor\r\n",
"createdAt": "2022-07-20T14:50:43Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABNwTA==",
"hasNextPage": false
},
"totalCount": 6,
"nodes": [
{
"diff": "## 2022-07-20 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [ ] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- References\r\n - https://static.sched.com/hosted_files/ossna2022/9b/presentation.pdf\r\n - > We're starting to put everything in registries, container images, signatures, SBOMs, attestations, cat pictures, we need to slow down. Our CI pipelines are designed to pass things as directories and files between stages, why aren't we doing this with our container images? OCI already defines an Image Layout Specification that defines how to structure the data on disk, and we should normalize how this is used in our tooling. This talk looks at the value of using the OCI Layout spec, what you can do today, what issues we're facing, and a call to action for more standardization between tooling in this space.\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun and Mike and Yan\r\n\r\nI commented on the OpenSFF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information.\r\n\r\nWe've been looking potentially at a hybrid DID plus rekor\r\narchitecture (DIDs eventually as a proxy to) \r\n\r\nReferences:\r\n- https://github.com/sigstore/rekor\r\n",
"editedAt": "2022-07-22T19:12:07Z"
},
{
"diff": "## 2022-07-20 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [ ] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- References\r\n - https://static.sched.com/hosted_files/ossna2022/9b/presentation.pdf\r\n - > We're starting to put everything in registries, container images, signatures, SBOMs, attestations, cat pictures, we need to slow down. Our CI pipelines are designed to pass things as directories and files between stages, why aren't we doing this with our container images? OCI already defines an Image Layout Specification that defines how to structure the data on disk, and we should normalize how this is used in our tooling. This talk looks at the value of using the OCI Layout spec, what you can do today, what issues we're facing, and a call to action for more standardization between tooling in this space.\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun and Mike and Yan\r\n\r\nI commented on the OpenSFF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information.\r\n\r\nWe've been looking potentially at a hybrid DID plus rekor\r\narchitecture (DIDs eventually as a proxy to) \r\n\r\nReferences:\r\n- https://github.com/sigstore/rekor\r\n",
"editedAt": "2022-07-20T15:00:12Z"
},
{
"diff": "## 2022-07-20 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [ ] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- References\r\n - https://static.sched.com/hosted_files/ossna2022/9b/presentation.pdf\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun and Mike and Yan\r\n\r\nI commented on the OpenSFF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information.\r\n\r\nWe've been looking potentially at a hybrid DID plus rekor\r\narchitecture (DIDs eventually as a proxy to) \r\n\r\nReferences:\r\n- https://github.com/sigstore/rekor\r\n",
"editedAt": "2022-07-20T14:59:25Z"
},
{
"diff": "## 2022-07-20 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [ ] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun and Mike and Yan\r\n\r\nI commented on the OpenSFF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information.\r\n\r\nWe've been looking potentially at a hybrid DID plus rekor\r\narchitecture (DIDs eventually as a proxy to) \r\n\r\nReferences:\r\n- https://github.com/sigstore/rekor\r\n",
"editedAt": "2022-07-20T14:56:52Z"
},
{
"diff": "## 2022-07-20 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun and Mike and Yan\r\n\r\nI commented on the OpenSFF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information.\r\n\r\nWe've been looking potentially at a hybrid DID plus rekor\r\narchitecture (DIDs eventually as a proxy to) \r\n\r\nReferences:\r\n- https://github.com/sigstore/rekor\r\n",
"editedAt": "2022-07-20T14:53:43Z"
},
{
"diff": "## 2022-07-20 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n\r\n---\r\n\r\nUnsent\r\n\r\nTo: Jun and Mike and Yan\r\n\r\nI commented on the OpenSFF Stream 8 doc recommending that DIDs be looked at\r\nas a way to exchange vulnerability information.\r\n\r\nWe've been looking potentially at a hybrid DID plus rekor\r\narchitecture (DIDs eventually as a proxy to) \r\n\r\nReferences:\r\n- https://github.com/sigstore/rekor\r\n",
"editedAt": "2022-07-20T14:50:43Z"
}
]
}
},
{
"body": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n - Will do doodle poll or something for slots\r\n - Jen is the one for the Zoom setup\r\n - Amir from ostif.org volunteering to answer questions\r\n - May want to do a blog in addition to twitter\r\n - Outreach maybe 4th or 5th, have the twitter points back\r\n to the blog to capture that.\r\n- Meeting time update\r\n - We have been doing this at this time for about a year or so\r\n - We previously alternated between two timeslots for Europe and Asia\r\n - Should we keep this 10 AM Pacific timeslot?\r\n - Alternate between US and APAC friendly timezone\r\n - Most other WGs are morning Pacific time\r\n- Technical Advisory Committee (TAC) update\r\n - They are tasked with making sure we are delivering on our\r\n cohesive promise, part of that is visuabliity and transparency\r\n into the work that we do.\r\n - We now have a formal reporting process\r\n - It's not a periodic we're all invited to show up to the TAC meeting\r\n one slide per project.\r\n - What we're doing\r\n - Why we're doing it\r\n - It's meant as an FYI, we are not asking for approval, we're letting them\r\n know what we're up to.\r\n - Everyone who is driving a process or project or thing, please send Mike\r\n a single slide, what is it, why are we doing it, what the status is,\r\n what's coming next, and if you need anything\r\n - Christine on metrics\r\n - Luigi for SECURITY-INSIGHTS.yml`\r\n - Mike will send out a template\r\n - Please fill and respond by Monday\r\n - Mike says the metrics work should live under a working group, maybe this one, maybe best practices\r\n - CRob might have an opinion here, as long as work gets done\r\n - As an org OpenSSF would benefit by being less siloed\r\n - Question on if we should align to streams?\r\n - LFX specific definition of metrics in mobilization paper\r\n - AR for Christine to sync with CRob and see what he thinks.\r\n - Will raise with TAC next week.\r\n- A few action items for metrics from Christine\r\n - Working groups are adopting streams from the mobilization plans\r\n- Mike: Alpha Omega\r\n - A few people were on the public call earlier\r\n - The recording will be on YouTube\r\n - Mike will give the fast version of the presentation right now\r\n - They are still hiring\r\n - Exploring ways of allocating headcount other than direct hiring\r\n - If you know anyone or are interested please apply or ping them!\r\n - Alpha\r\n - Announced Node, Python, Eclipse\r\n - Omega\r\n - Toolchain is pending\r\n - Waiting for legal approval due to the way the license for CodeQL works\r\n - Had a CVE in Node that got fixed earlier this month\r\n - RCE in JSHint that was bitrotted (unused) we removed\r\n - Two CVEs discloudsed yetserday 
and two more in the works (couple weeks to release ETA)\r\n    - Found NodeJS vuln via system call tracing\r\n      - It tries to query `openssl.cnf` and dumps strace logs to a repo\r\n      - You then have a one stop shop: show me every linked package, or whether a binary does a DNS query when it starts\r\n      - John: Sounds aligned with Alice's goals\r\n  - https://sos.dev coming under Alpha-Omega\r\n    - Allows us to compensate dev directly\r\n  - How to participate\r\n    - Improve security tools\r\n      - https://sos.dev\r\n    - Join working groups\r\n    - Get on slack\r\n- Amir: Security Reviews\r\n  - Repo is looking good\r\n  - Updating with four new audits that ostif.org published last week\r\n  - At almost 100 reviews from Mike (Omega work), ostif.org, and community\r\n  - We're gaining traction, getting good stuff in there all the time\r\n  - Might need some help with the automated testing that gets done\r\n    when we upload reviews.\r\n  - Feedback always welcome.\r\n- John: Collection of metric / Alpha-Omega data into shared DB\r\n  - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n  - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n  - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n  - https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n  - Mike\r\n    - Mike has been thinking about SCITT as a schema and rules on how one would assert facts; whether it's confidential compute or traditional permissions is an implementation detail.\r\n    - If metrics runs across your repo and you have 30 contributors, great\r\n    - As a consumer, how can I discover that fact and trust that it's accurate\r\n    - Could imagine a world where things like Scorecard express the data as a SCITT assertion\r\n    - You go and query that store and you say tell me everything you know about foo and you get it all back\r\n    - Until we have an implementation with Web5 that's at least beta, we could explore what that looks like.\r\n    - John: We can do rekor for now, we'll bridge it all later, target 1-2 years out\r\n  - John: We have alignment. Time to execute. rekor + sigstore for metric data attestation signed with GitHub OIDC tokens. We care about data provenance. We will later bridge into the web5 space used as central points of comms given DID as effectively the URL of the future. This is in relation to what we talked to Melvin about with data provenance. We need to start planning how we are going to build up this space now so we can have provenance on thoughts later. This provenance could, for example, be on inference derived from provenance of training data and model training env and config. This will allow us to ensure the prioritizer makes decisions based on spirit of the law / aka intent based policy derived from the Trinity of Static Analysis, Dynamic Analysis, and Human Intent.\r\n    - Living Threat Model threats, mitigations, trust boundaries as initial data set for cross domain conceptual mapping of the trinity to build pyramid of thought alignment to strategic principles.\r\n      - One of our strategic plans / principles says: \"We must be able to trust the sources of all input data used for all model training was done from research studies with these ethical certifications\"\r\n      - This allows us to write policies (Open Policy Agent to JSON to DID/VC/SCITT translation/application exploration still in progress) for the organizations we form and apply them as overlays to flows we execute where context appropriate. 
These overlaid flows define the trusted parties within that context as applicable to the active organizational policies and the top level system context.\r\n      - For the policy associated with the principle that consumes the overlaid trust attestations, we will implement an LTM auditor which checks the SCITT provenance information associated with the operation implementations and the operation implementation network, input network, etc. within the orchestrator's trust boundary (TODO: need to track usages / `reuse` of contexts `ictx`, `nctx`, etc. with something predeclared, aka at runtime if your `Operation` data structure doesn't allowlist your usage of it you can pass it to a subflow for reuse). This allows us to use the format within our orchestration and for static analysis, because we can use this same format to describe the trust boundary properties that other domain specific representations of architecture have. For instance, if we were doing an Open Architecture (OA) Intermediate Representation (IR) for an ELF file we might note that the input network context is not reused from the top level system context. Whereas if we did an OA IR for Python code we would say that the input network is reused from the top level system context (it has access to that memory region, whereas when you launch an ELF you lose access to the parent's memory region, typically).\r\n  - Christine\r\n    - Looking at trying to connect all the different data sources\r\n- References\r\n  - [Meeting Notes](https://docs.google.com/document/d/1AfI0S6VjBCO0ZkULCYZGHuzzW8TPqO3zYxRjzmKvUB4/edit?usp=sharing)\r\n  - [GitHub Workgroup Page](https://github.com/ossf/wg-identifying-security-threats)\r\n  - [OpenSSF Slack](https://slack.openssf.org)\r\n  - [Metric Dashboard](https://metrics.openssf.org)\r\n- TODO\r\n  - @pdxjohnny\r\n    - [ ] Reach out to Christine about metrics collaboration\r\n    - [ ] Respond with slides for Mike if he asks",
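A minimal sketch of the predeclared context reuse described in the TODO above (all names here, `Operation`, `reuses`, `check_context_reuse`, are illustrative assumptions, not DFFML's real data structures): an operation carries an allowlist of shared contexts (`ictx`, `nctx`, ...) it may reuse from the top level system context, and the orchestrator, or a static analysis pass, checks that allowlist before a context is handed to a subflow. The two example operations mirror the ELF vs Python IR contrast in the notes: one never reuses the parent's input network context, the other predeclares it.

```python
# Hypothetical sketch of predeclared context reuse, NOT DFFML's real API.
from dataclasses import dataclass, field
from typing import FrozenSet


@dataclass(frozen=True)
class Operation:
    # Name of the operation implementation.
    name: str
    # Shared contexts this operation is allowed to reuse from the parent
    # (top level) system context, e.g. "ictx" (input network context) or
    # "nctx" (operation network context).
    reuses: FrozenSet[str] = field(default_factory=frozenset)


class ContextReuseError(Exception):
    """Raised when an operation touches a context it never predeclared."""


def check_context_reuse(operation: Operation, requested: str) -> None:
    # The same declaration can be inspected offline to describe trust
    # boundary properties, or enforced at runtime before a context is
    # passed down into a subflow.
    if requested not in operation.reuses:
        raise ContextReuseError(
            f"{operation.name} did not predeclare reuse of {requested!r}"
        )


if __name__ == "__main__":
    python_like = Operation(name="oa_ir_python", reuses=frozenset({"ictx"}))
    elf_like = Operation(name="oa_ir_elf", reuses=frozenset())

    check_context_reuse(python_like, "ictx")  # allowed: predeclared
    try:
        check_context_reuse(elf_like, "ictx")  # denied: not predeclared
    except ContextReuseError as error:
        print(error)
```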
"createdAt": "2022-07-20T17:09:05Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABNyKg==",
"hasNextPage": false
},
"totalCount": 22,
"nodes": [
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n - Will do doodle poll or something for slots\r\n - Jen is the one for the Zoom setup\r\n - Amir from ostif.org volunteering to answer questions\r\n - May want to do a blog in addition to twitter\r\n - Outreach maybe 4th or 5th, have the twitter points back\r\n to the blog to capture that.\r\n- Meeting time update\r\n - We have been doing this at this time for about a year or so\r\n - We previously alternated between two timeslots for Europe and Asia\r\n - Should we keep this 10 AM Pacific timeslot?\r\n - Alternate between US and APAC friendly timezone\r\n - Most other WGs are morning Pacific time\r\n- Technical Advisory Committee (TAC) update\r\n - They are tasked with making sure we are delivering on our\r\n cohesive promise, part of that is visuabliity and transparency\r\n into the work that we do.\r\n - We now have a formal reporting process\r\n - It's not a periodic we're all invited to show up to the TAC meeting\r\n one slide per project.\r\n - What we're doing\r\n - Why we're doing it\r\n - It's meant as an FYI, we are not asking for approval, we're letting them\r\n know what we're up to.\r\n - Everyone who is driving a process or project or thing, please send Mike\r\n a single slide, what is it, why are we doing it, what the status is,\r\n what's coming next, and if you need anything\r\n - Christine on metrics\r\n - Luigi for SECURITY-INSIGHTS.yml`\r\n - Mike will send out a template\r\n - Please fill and respond by Monday\r\n - Mike says the metrics work should live under a working group, maybe this one, maybe best practices\r\n - CRob might have an opinion here, as long as work gets done\r\n - As an org OpenSSF would benefit by being less siloed\r\n - Question on if we should align to streams?\r\n - LFX specific definition of metrics in mobilization paper\r\n - AR for Christine to sync with CRob and see what he thinks.\r\n - Will raise with TAC next week.\r\n- A few action items for metrics from Christine\r\n - Working groups are adopting streams from the mobilization plans\r\n- Mike: Alpha Omega\r\n - A few people were on the public call earlier\r\n - The recording will be on YouTube\r\n - Mike will give the fast version of the presentation right now\r\n - They are still hiring\r\n - Exploring ways of allocating headcount other than direct hiring\r\n - If you know anyone or are interested please apply or ping them!\r\n - Alpha\r\n - Announced Node, Python, Eclipse\r\n - Omega\r\n - Toolchain is pending\r\n - Waiting for legal approval due to the way the license for CodeQL works\r\n - Had a CVE in Node that got fixed earlier this month\r\n - RCE in JSHint that was bitrotted (unused) we removed\r\n - Two CVEs discloudsed yetserday 
and two more in the works (couple weeks to release ETA)\r\n - Found NodeJS vuln via systemcall tracing\r\n - It tires to query `openssl.cnf` and dumps strace logs to a repo\r\n - You then have a one stop show of show me every link package of when a binary starts, it does a DNS query\r\n - John: Sounds aligned with Alice's goals\r\n - https://sos.dev coming under Alpha-Omega\r\n - Allows us to compensate dev directly\r\n - How to participate\r\n - Improve security tools\r\n - https://sos.dev\r\n - Join working groups\r\n - Get on slack\r\n- Amir: Security Reviews\r\n - Repo is looking good\r\n - Updating with four new audits that ostif.org published last week\r\n - At almost 100 reviews from Mike (Omega work), ostif.org, and community\r\n - We're gaining traction, getting good stuff in there all the time\r\n - Might need some help with the automated testing that get's done\r\n when we upload reviews.\r\n - Feedback always welcome.\r\n- John: Collection of metric / Alpha-Omega data into shared DB\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n - Mike\r\n - Mike has been thinking about SCITT as a schema and rules on how one would assert facts, weither it's confidential compute or traditional permissions is impelmenetation details.\r\n - If metircs runs across you're repo and you have 30 contributors, great\r\n - As consumer, how can I discover that fact and trust that it's accruate\r\n - Could immaiget a world where things like Scorecard express the data as as SCITT assursion\r\n - You go and query that store and you say tell me everythig you know about foo and you get it all back\r\n - Until we have an implementation with WEb5 that's at at least beta, we could expore what that looks like.\r\n - John: We can do rekor for now, we'll bridge it all later target 1-2 years out\r\n - John: We have alignment. Time to execute. rekor + sigstore for metric data atteststation signed with github odic tokens. We care about data provenance. We will later bridge into web5 space used as central points of comms given DID as effectively the URL or the future. This is in realtion to what we talked to Melvin about with data provenance. We need to start planning how we are going to build up this space now so we can have provenance on thoughts later. This provenance could be for example on inference derived from provenance from training data and model training env and config. This will allow us to ensure the prioritizer make decisions based on Sprit of the law / aka intent based policy derived from Trinity of Static Analysis, Dynamic Analysis, and Human Intent.\r\n - Living Threat Model threats, mitigations, trust boundaries as initial data set for cross domain conceptual mapping of the the trinity to build pyramid of thought alignment to strategic principles.\r\n - One of our strategic plans / principles says: \"We must be able to trust the sources of all input data used for all model training was done from research studies with these ethical certifications\"\r\n - This allows us to write policies (Open Policy Agent to JSON to DID/VC/SCITT translation/application exploration still in progress) for the organizations we form and apply them as overlays to flows we execute where context appropriate. 
These overlaid flows define the trusted parties within that context as applicable to the active organizational policies as applicable to the top level system context.\r\n - The policy associated with the principle that consumes the overlaid trust attestations we will implement and LTM auditor for which checks the SCITT provenance information associated with the operation implementations and the operation implementation network, input network, etc. within the orchestrators trust boundary (TODO need to track usages / `reuse` of contexts `ictx`, `nctx`, etc. with something predeclared, aka at runtime if your `Operation` data structure doesn't allowlist your usage of it you can pass it to a subflow for reuse. This allows us to use the format within our orchrestration and for static analysis because we can use this same format to describe the trust boundry proeprties that other domain sepcific represenatations of architecture have, for instance we could if we were doing and Open Architecture (OA) Intermediate Representation (IR) for and ELF file we might note that the input network context is not reused from the top level system context. Where as if we did an OA IR for Python code we would say that the input network is reused from the top level system context (it has access to that memory region, whereas when you launch and ELF you look access to the parents memory region, typically).\r\n - Christine\r\n - Looking at trying to connect all the different data sources\r\n- References\r\n - [Meeting Notes](https://docs.google.com/document/d/1AfI0S6VjBCO0ZkULCYZGHuzzW8TPqO3zYxRjzmKvUB4/edit?usp=sharing)\r\n - [GitHub Workgroup Page](https://github.com/ossf/wg-identifying-security-threats)\r\n - [OpenSSF Slack](https://slack.openssf.org)\r\n - [Metric Dashboard](https://metrics.openssf.org)\r\n- TODO\r\n - @pdxjohnny\r\n - [ ] Reach out to Christine about metrics collaboration\r\n - [ ] Respond with slides for Mike if he asks",
"editedAt": "2022-09-28T17:01:02Z"
},
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n - Will do doodle poll or something for slots\r\n - Jen is the one for the Zoom setup\r\n - Amir from ostif.org volunteering to answer questions\r\n - May want to do a blog in addition to twitter\r\n - Outreach maybe 4th or 5th, have the twitter points back\r\n to the blog to capture that.\r\n- Meeting time update\r\n - We have been doing this at this time for about a year or so\r\n - We previously alternated between two timeslots for Europe and Asia\r\n - Should we keep this 10 AM Pacific timeslot?\r\n - Alternate between US and APAC friendly timezone\r\n - Most other WGs are morning Pacific time\r\n- Technical Advisory Committee (TAC) update\r\n - They are tasked with making sure we are delivering on our\r\n cohesive promise, part of that is visuabliity and transparency\r\n into the work that we do.\r\n - We now have a formal reporting process\r\n - It's not a periodic we're all invited to show up to the TAC meeting\r\n one slide per project.\r\n - What we're doing\r\n - Why we're doing it\r\n - It's meant as an FYI, we are not asking for approval, we're letting them\r\n know what we're up to.\r\n - Everyone who is driving a process or project or thing, please send Mike\r\n a single slide, what is it, why are we doing it, what the status is,\r\n what's coming next, and if you need anything\r\n - Christine on metrics\r\n - Luigi for SECURITY-INSIGHTS.yml`\r\n - Mike will send out a template\r\n - Please fill and respond by Monday\r\n - Mike says the metrics work should live under a working group, maybe this one, maybe best practices\r\n - CRob might have an opinion here, as long as work gets done\r\n - As an org OpenSSF would benefit by being less siloed\r\n - Question on if we should align to streams?\r\n - LFX specific definition of metrics in mobilization paper\r\n - AR for Christine to sync with CRob and see what he thinks.\r\n - Will raise with TAC next week.\r\n- A few action items for metrics from Christine\r\n - Working groups are adopting streams from the mobilization plans\r\n- Mike: Alpha Omega\r\n - A few people were on the public call earlier\r\n - The recording will be on YouTube\r\n - Mike will give the fast version of the presentation right now\r\n - They are still hiring\r\n - Exploring ways of allocating headcount other than direct hiring\r\n - If you know anyone or are interested please apply or ping them!\r\n - Alpha\r\n - Announced Node, Python, Eclipse\r\n - Omega\r\n - Toolchain is pending\r\n - Waiting for legal approval due to the way the license for CodeQL works\r\n - Had a CVE in Node that got fixed earlier this month\r\n - RCE in JSHint that was bitrotted (unused) we removed\r\n - Two CVEs discloudsed yetserday 
and two more in the works (couple weeks to release ETA)\r\n - Found NodeJS vuln via systemcall tracing\r\n - It tires to query `openssl.cnf` and dumps strace logs to a repo\r\n - You then have a one stop show of show me every link package of when a binary starts, it does a DNS query\r\n - John: Sounds aligned with Alice's goals\r\n - https://sos.dev coming under Alpha-Omega\r\n - Allows us to compensate dev directly\r\n - How to participate\r\n - Improve security tools\r\n - https://sos.dev\r\n - Join working groups\r\n - Get on slack\r\n- Amir: Security Reviews\r\n - Repo is looking good\r\n - Updating with four new audits that ostif.org published last week\r\n - At almost 100 reviews from Mike (Omega work), ostif.org, and community\r\n - We're gaining traction, getting good stuff in there all the time\r\n - Might need some help with the automated testing that get's done\r\n when we upload reviews.\r\n - Feedback always welcome.\r\n- John: Collection of metric / Alpha-Omega data into shared DB\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n - Mike\r\n - Mike has been thinking about SCITT as a schema and rules on how one would assert facts, weither it's confidential compute or traditional permissions is impelmenetation details.\r\n - If metircs runs across you're repo and you have 30 contributors, great\r\n - As consumer, how can I discover that fact and trust that it's accruate\r\n - Could immaiget a world where things like Scorecard express the data as as SCITT assursion\r\n - You go and query that store and you say tell me everythig you know about foo and you get it all back\r\n - Until we have an implementation with WEb5 that's at at least beta, we could expore what that looks like.\r\n - John: We can do rekor for now, we'll bridge it all later target 1-2 years out\r\n - John: We have alignment. Time to execute. rekor + sigstore for metric data atteststation signed with github odic tokens. We care about data provenance. We will later bridge into web5 space used as central points of comms given DID as effectively the URL or the future. This is in realtion to what we talked to Melvin about with data provenance. We need to start planning how we are going to build up this space now so we can have provenance on thoughts later. This provenance could be for example on inference derived from provenance from training data and model training env and config. This will allow us to ensure the prioritizer make decisions based on Sprit of the law / aka intent based policy derived from Trinity of Static Analysis, Dynamic Analysis, and Human Intent.\r\n - Living Threat Model threats, mitigations, trust boundaries as initial data set for cross domain conceptual mapping of the the trinity to build pyramid of thought alignment to strategic principles.\r\n - One of our strategic plans / principles says: \"We must be able to trust the sources of all input data used for all model training was done from research studies with these ethical certifications\"\r\n - This allows us to write policies (Open Policy Agent to JSON to DID/VC/SCITT translation/application exploration still in progress) for the organizations we form and apply them as overlays to flows we execute where context appropriate. 
These overlaid flows define the trusted parties within that context as applicable to the active organizational policies as applicable to the top level system context.\r\n - The policy associated with the principle that consumes the overlaid trust attestations we will implement and LTM auditor for which checks the SCITT provenance information associated with the operation implementations and the operation implementation network, input network, etc. within the orchestrators trust boundary (TODO need to track usages / `reuse` of contexts `ictx`, `nctx`, etc. with something predeclared, aka at runtime if your `Operation` data structure doesn't allowlist your usage of it you can pass it to a subflow for reuse. This allows us to use the format within our orchrestration and for static analysis because we can use this same format to describe the trust boundry proeprties that other domain sepcific represenatations of architecture have, for instance we could if we were doing and Open Architecture (OA) Intermediate Representation (IR) for and ELF file we might note that the input network context is not reused from the top level system context. Where as if we did an OA IR for Python code we would say that the input network is reused from the top level system context (it has access to that memory region, whereas when you launch and ELF you look access to the parents memory region, typically).\r\n - Christine\r\n - Looking at trying to connect all the different data sources\r\n- References\r\n - [Meeting Notes](https://docs.google.com/document/d/1AfI0S6VjBCO0ZkULCYZGHuzzW8TPqO3zYxRjzmKvUB4/edit?usp=sharing)\r\n - [GitHub Workgroup Page](https://www.google.com/url?q=https%3A%2F%2Fgithub.com%2Fossf%2Fwg-identifying-security-threats&sa=D&ust=1649124843387000&usg=AOvVaw0GZyFGR88kDmGyacUU0opq)\r\n - [OpenSSF Slack](https://www.google.com/url?q=https%3A%2F%2Fslack.openssf.org%2F&sa=D&ust=1649124843387000&usg=AOvVaw2o1O1sSQ15LmhM4jfZyZ0D)\r\n - [Metric Dashboard](https://www.google.com/url?q=https%3A%2F%2Fmetrics.openssf.org%2F&sa=D&ust=1649124843387000&usg=AOvVaw2iZFHEQevKr-uG2kHT5kow)\r\n- TODO\r\n - @pdxjohnny\r\n - [ ] Reach out to Christine about metrics collaboration\r\n - [ ] Respond with slides for Mike if he asks",
"editedAt": "2022-07-22T18:41:24Z"
},
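The rekor-for-now plan above, signing metric data attestations (for example with `cosign` keyless signing backed by GitHub OIDC) and letting any consumer ask the transparency log "tell me everything you know about foo", can be sketched against Rekor's public REST API. This is a minimal sketch, assuming the public `rekor.sigstore.dev` instance with its `POST /api/v1/index/retrieve` (digest to entry UUIDs) and `GET /api/v1/log/entries/{uuid}` endpoints; the helper name and usage are illustrative, not an agreed design:

```python
# Minimal sketch (assumptions noted above): given the sha256 digest of a
# metric data attestation, ask Rekor which transparency log entries
# reference it, then fetch each entry.
import json
import sys
import urllib.request

REKOR = "https://rekor.sigstore.dev"


def rekor_entries_for_digest(sha256_hex: str) -> dict:
    """Return {entry_uuid: entry} for every Rekor entry referencing the digest."""
    request = urllib.request.Request(
        f"{REKOR}/api/v1/index/retrieve",
        data=json.dumps({"hash": f"sha256:{sha256_hex}"}).encode(),
        headers={"Content-Type": "application/json", "Accept": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        uuids = json.load(response)
    entries = {}
    for uuid in uuids:
        with urllib.request.urlopen(f"{REKOR}/api/v1/log/entries/{uuid}") as response:
            entries.update(json.load(response))
    return entries


if __name__ == "__main__":
    # Usage: python rekor_lookup.py <sha256-of-attestation-blob>
    print(json.dumps(rekor_entries_for_digest(sys.argv[1]), indent=2))
```

In practice the uploaded entry would carry the attestation's signature and signing certificate, so a consumer can check which identity (for example which GitHub workflow) produced the metric data before trusting it.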
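The TODO in the entry above about predeclaring `reuse` of contexts (`ictx`, `nctx`, etc.) is easier to see in code. This is a hypothetical sketch rather than DFFML's actual `Operation` API: the `reuse` field, the `check_reuse` helper, and the example operation names are made up to illustrate the allowlist idea, an ELF analysis flow that does not reuse the parent input network versus a Python code flow that does:

```python
# Hypothetical sketch (not DFFML's real Operation class): an operation must
# predeclare which orchestrator contexts it may reuse from the parent
# system context, so the reuse is visible to static analysis and auditable
# at runtime.
import dataclasses
from typing import FrozenSet


@dataclasses.dataclass(frozen=True)
class Operation:
    name: str
    # Contexts this operation is allowed to reuse, e.g. "ictx" (input
    # network) or "nctx" (operation network). Empty means no reuse.
    reuse: FrozenSet[str] = frozenset()


def check_reuse(operation: Operation, requested: str) -> None:
    """Raise if an operation asks for a context it did not predeclare."""
    if requested not in operation.reuse:
        raise PermissionError(
            f"{operation.name} did not allowlist reuse of {requested!r}"
        )


# OA IR for Python code: shares the parent's input network (same memory region).
oa_ir_python = Operation(name="oa_ir_python", reuse=frozenset({"ictx"}))
# OA IR for an ELF file: a new process, no reuse of the parent's input network.
oa_ir_elf = Operation(name="oa_ir_elf")

check_reuse(oa_ir_python, "ictx")  # passes
# check_reuse(oa_ir_elf, "ictx")   # would raise PermissionError
```

An LTM auditor along the lines described above could then compare these declarations against the SCITT/Rekor provenance for each operation implementation before the orchestrator admits it inside its trust boundary.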
ETA)\r\n - Found NodeJS vuln via systemcall tracing\r\n - It tires to query `openssl.cnf` and dumps strace logs to a repo\r\n - You then have a one stop show of show me every link package of when a binary starts, it does a DNS query\r\n - Sounds kinda like Alice's goals\r\n - https://sos.dev coming under Alpha-Omega\r\n - Allows us to compensate dev directly\r\n - How to participate\r\n - Improve security tools\r\n - https://sos.dev\r\n - Join working groups\r\n - Get on slack\r\n- Amir: Security Reviews\r\n - Repo is looking good\r\n - Updating with four new audits that ostif.org published last week\r\n - At almost 100 reviews from Mike (Omega work), ostif.org, and community\r\n - We're gaining traction, getting good stuff in there all the time\r\n - Might need some help with the automated testing that get's done\r\n when we upload reviews.\r\n - Feedback always welcome.\r\n- John: Collection of metric / Alpha-Omega data into shared DB\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n- TODO\r\n - @pdxjohnny\r\n - [ ] Reach out to Christine about metrics collaboration",
"editedAt": "2022-07-20T17:34:21Z"
},
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n - Will do doodle poll or something for slots\r\n - Jen is the one for the Zoom setup\r\n - Amir from ostif.org volunteering to answer questions\r\n - May want to do a blog in addition to twitter\r\n - Outreach maybe 4th or 5th, have the twitter points back\r\n to the blog to capture that.\r\n- Meeting time update\r\n - We have been doing this at this time for about a year or so\r\n - We previously alternated between two timeslots for Europe and Asia\r\n - Should we keep this 10 AM Pacific timeslot?\r\n - Alternate between US and APAC friendly timezone\r\n - Most other WGs are morning Pacific time\r\n- Technical Advisory Committee (TAC) update\r\n - They are tasked with making sure we are delivering on our\r\n cohesive promise, part of that is visuabliity and transparency\r\n into the work that we do.\r\n - We now have a formal reporting process\r\n - It's not a periodic we're all invited to show up to the TAC meeting\r\n one slide per project.\r\n - What we're doing\r\n - Why we're doing it\r\n - It's meant as an FYI, we are not asking for approval, we're letting them\r\n know what we're up to.\r\n - Everyone who is driving a process or project or thing, please send Mike\r\n a single slide, what is it, why are we doing it, what the status is,\r\n what's coming next, and if you need anything\r\n - Christine on metrics\r\n - Luigi for SECURITY-INSIGHTS.yml`\r\n - Mike will send out a template\r\n - Mike says the metrics work should live under a working group, maybe this one, maybe best practices\r\n - CRob might have an opinion here, as long as work gets done\r\n - As an org OpenSSF would benefit by being less siloed\r\n - Question on if we should align to streams?\r\n - LFX specific definition of metrics in mobilization paper\r\n - AR to sync with CRob and see what he thinks.\r\n - Will raise with TAC next week.\r\n- A few action items for metrics from Christine\r\n - Working groups are adopting streams from the mobilization plans\r\n- Alpha Omega\r\n - A few people were on the public call earlier\r\n - The recording will be on YouTube\r\n - Mike will give the fast version of the presentation right now\r\n - They are still hiring\r\n - Exploring ways of allocating headcount other than direct hiring\r\n - If you know anyone or are interested please apply or ping them!\r\n - Alpha\r\n - Annouced Node, Python, Eclipse\r\n - Omega\r\n - Toolchain is pending\r\n - Waiting for legal approval due to the way the license for CodeQL works\r\n - Had a CVE in Node that got fixed earlier this month\r\n - RCE in JSHint that was bitrotted (unused) we removed\r\n - Two CVEs discloudsed yetserday and two more in the works (couple weeks to release ETA)\r\n - 
Found NodeJS vuln via systemcall tracing\r\n - It tires to query `openssl.cnf` and dumps strace logs to a repo\r\n - You then have a one stop show of show me every link package of when a binary starts, it does a DNS query\r\n - Sounds kinda like Alice's goals\r\n - https://sos.dev coming under Alpha-Omega\r\n - Allows us to compensate dev directly\r\n - How to participate\r\n - Improve security tools\r\n - https://sos.dev\r\n - Join working groups\r\n - Get on slack\r\n- Collection of metric / Alpha-Omega data into shared DB\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n- TODO\r\n - @pdxjohnny\r\n - [ ] Reach out to Christine about metrics collaboration",
"editedAt": "2022-07-20T17:32:36Z"
},
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n - Will do doodle poll or something for slots\r\n - Jen is the one for the Zoom setup\r\n - Amir from ostif.org volunteering to answer questions\r\n - May want to do a blog in addition to twitter\r\n - Outreach maybe 4th or 5th, have the twitter points back\r\n to the blog to capture that.\r\n- Meeting time update\r\n - We have been doing this at this time for about a year or so\r\n - We previously alternated between two timeslots for Europe and Asia\r\n - Should we keep this 10 AM Pacific timeslot?\r\n - Alternate between US and APAC friendly timezone\r\n - Most other WGs are morning Pacific time\r\n- Technical Advisory Committee (TAC) update\r\n - They are tasked with making sure we are delivering on our\r\n cohesive promise, part of that is visuabliity and transparency\r\n into the work that we do.\r\n - We now have a formal reporting process\r\n - It's not a periodic we're all invited to show up to the TAC meeting\r\n one slide per project.\r\n - What we're doing\r\n - Why we're doing it\r\n - It's meant as an FYI, we are not asking for approval, we're letting them\r\n know what we're up to.\r\n - Everyone who is driving a process or project or thing, please send Mike\r\n a single slide, what is it, why are we doing it, what the status is,\r\n what's coming next, and if you need anything\r\n - Christine on metrics\r\n - Luigi for SECURITY-INSIGHTS.yml`\r\n - Mike will send out a template\r\n - Mike says the metrics work should live under a working group, maybe this one, maybe best practices\r\n - CRob might have an opinion here, as long as work gets done\r\n - As an org OpenSSF would benefit by being less siloed\r\n - Question on if we should align to streams?\r\n - LFX specific definition of metrics in mobilization paper\r\n - AR to sync with CRob and see what he thinks.\r\n - Will raise with TAC next week.\r\n- A few action items for metrics from Christine\r\n - Working groups are adopting streams from the mobilization plans\r\n- Collection of data into shared DB\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n- Alpha Omega\r\n - A few people were on the public call earlier\r\n - The recording will be on YouTube\r\n - Mike will give the fast version of the presentation right now\r\n - They are still hiring\r\n - Exploring ways of allocating headcount other than direct hiring\r\n - If you know anyone or are interested please apply or ping them!\r\n - Alpha\r\n - Annouced Node, Python, 
Eclipse\r\n - Omega\r\n - Toolchain is pending\r\n - Waiting for legal approval due to the way the license for CodeQL works\r\n - Had a CVE in Node that got fixed earlier this month\r\n - RCE in JSHint that was bitrotted (unused) we removed\r\n - Two CVEs discloudsed yetserday and two more in the works (couple weeks to release ETA)\r\n - Found NodeJS vuln via systemcall tracing\r\n - It tires to query `openssl.cnf` and dumps strace logs to a repo\r\n - You then have a one stop show of show me every link package of when a binary starts, it does a DNS query\r\n - Sounds kinda like Alice's goals\r\n - https://sos.dev coming under Alpha-Omega\r\n - Allows us to compensate dev directly\r\n - How to participate\r\n - Improve security tools\r\n - https://sos.dev\r\n - Join working groups\r\n - Get on slack\r\n- TODO\r\n - @pdxjohnny\r\n - [ ] Reach out to Christine about metrics collaboration",
"editedAt": "2022-07-20T17:32:01Z"
},
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n - Will do doodle poll or something for slots\r\n - Jen is the one for the Zoom setup\r\n - Amir from ostif.org volunteering to answer questions\r\n - May want to do a blog in addition to twitter\r\n - Outreach maybe 4th or 5th, have the twitter points back\r\n to the blog to capture that.\r\n- Meeting time update\r\n - We have been doing this at this time for about a year or so\r\n - We previously alternated between two timeslots for Europe and Asia\r\n - Should we keep this 10 AM Pacific timeslot?\r\n - Alternate between US and APAC friendly timezone\r\n - Most other WGs are morning Pacific time\r\n- Technical Advisory Committee (TAC) update\r\n - They are tasked with making sure we are delivering on our\r\n cohesive promise, part of that is visuabliity and transparency\r\n into the work that we do.\r\n - We now have a formal reporting process\r\n - It's not a periodic we're all invited to show up to the TAC meeting\r\n one slide per project.\r\n - What we're doing\r\n - Why we're doing it\r\n - It's meant as an FYI, we are not asking for approval, we're letting them\r\n know what we're up to.\r\n - Everyone who is driving a process or project or thing, please send Mike\r\n a single slide, what is it, why are we doing it, what the status is,\r\n what's coming next, and if you need anything\r\n - Christine on metrics\r\n - Luigi for SECURITY-INSIGHTS.yml`\r\n - Mike will send out a template\r\n - Mike says the metrics work should live under a working group, maybe this one, maybe best practices\r\n - CRob might have an opinion here, as long as work gets done\r\n - As an org OpenSSF would benefit by being less siloed\r\n - Question on if we should align to streams?\r\n - LFX specific definition of metrics in mobilization paper\r\n - AR to sync with CRob and see what he thinks.\r\n - Will raise with TAC next week.\r\n- A few action items for metrics from Christine\r\n - Working groups are adopting streams from the mobilization plans\r\n- Collection of data into shared DB\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n- Alpha Omega\r\n - A few people were on the public call earlier\r\n - The recording will be on YouTube\r\n - Mike will give the fast version of the presentation right now\r\n - They are still hiring\r\n - Exploring ways of allocating headcount other than direct hiring\r\n - If you know anyone or are interested please apply or ping them!\r\n - Alpha\r\n - Annouced Node, Python, 
Eclipse\r\n - Omega\r\n - Toolchain is pending\r\n - Waiting for legal approval due to the way the license for CodeQL works\r\n - Had a CVE in Node that got fixed earlier this month\r\n - RCE in JSHint that was bitrotted (unused) we removed\r\n - Two CVEs discloudsed yetserday and two more in the works (couple weeks to release ETA)\r\n - Found NodeJS vuln via systemcall tracing\r\n - It tires to query `openssl.cnf` and dumps strace logs to a repo\r\n - You then have a one stop show of show me every link package of when a binary starts, it does a DNS query\r\n - Sounds kinda like Alice's goals\r\n - https://sos.dev coming under Alpha-Omega\r\n - Allows us to compensate dev directly\r\n - How to participate\r\n - Improve security tools\r\n - sos.dev\r\n - Join working groups\r\n - Get on slack\r\n- TODO\r\n - @pdxjohnny\r\n - [ ] Reach out to Christine about metrics collaboration",
"editedAt": "2022-07-20T17:31:53Z"
},
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n - Will do doodle poll or something for slots\r\n - Jen is the one for the Zoom setup\r\n - Amir from ostif.org volunteering to answer questions\r\n - May want to do a blog in addition to twitter\r\n - Outreach maybe 4th or 5th, have the twitter points back\r\n to the blog to capture that.\r\n- Meeting time update\r\n - We have been doing this at this time for about a year or so\r\n - We previously alternated between two timeslots for Europe and Asia\r\n - Should we keep this 10 AM Pacific timeslot?\r\n - Alternate between US and APAC friendly timezone\r\n - Most other WGs are morning Pacific time\r\n- Technical Advisory Committee (TAC) update\r\n - They are tasked with making sure we are delivering on our\r\n cohesive promise, part of that is visuabliity and transparency\r\n into the work that we do.\r\n - We now have a formal reporting process\r\n - It's not a periodic we're all invited to show up to the TAC meeting\r\n one slide per project.\r\n - What we're doing\r\n - Why we're doing it\r\n - It's meant as an FYI, we are not asking for approval, we're letting them\r\n know what we're up to.\r\n - Everyone who is driving a process or project or thing, please send Mike\r\n a single slide, what is it, why are we doing it, what the status is,\r\n what's coming next, and if you need anything\r\n - Christine on metrics\r\n - Luigi for SECURITY-INSIGHTS.yml`\r\n - Mike will send out a template\r\n - Mike says the metrics work should live under a working group, maybe this one, maybe best practices\r\n - CRob might have an opinion here, as long as work gets done\r\n - As an org OpenSSF would benefit by being less siloed\r\n - Question on if we should align to streams?\r\n - LFX specific definition of metrics in mobilization paper\r\n - AR to sync with CRob and see what he thinks.\r\n - Will raise with TAC next week.\r\n- A few action items for metrics from Christine\r\n - Working groups are adopting streams from the mobilization plans\r\n- Collection of data into shared DB\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n- Alpha Omega\r\n - A few people were on the public call earlier\r\n - The recording will be on YouTube\r\n - Mike will give the fast version of the presentation right now\r\n - \r\n- TODO\r\n - @pdxjohnny\r\n - [ ] Reach out to Christine about metrics collaboration",
"editedAt": "2022-07-20T17:26:16Z"
},
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n - Will do doodle poll or something for slots\r\n - Jen is the one for the Zoom setup\r\n - Amir from ostif.org volunteering to answer questions\r\n - May want to do a blog in addition to twitter\r\n - Outreach maybe 4th or 5th, have the twitter points back\r\n to the blog to capture that.\r\n- Meeting time update\r\n - We have been doing this at this time for about a year or so\r\n - We previously alternated between two timeslots for Europe and Asia\r\n - Should we keep this 10 AM Pacific timeslot?\r\n - Alternate between US and APAC friendly timezone\r\n - Most other WGs are morning Pacific time\r\n- Technical Advisory Committee (TAC) update\r\n - They are tasked with making sure we are delivering on our\r\n cohesive promise, part of that is visuabliity and transparency\r\n into the work that we do.\r\n - We now have a formal reporting process\r\n - It's not a periodic we're all invited to show up to the TAC meeting\r\n one slide per project.\r\n - What we're doing\r\n - Why we're doing it\r\n - It's meant as an FYI, we are not asking for approval, we're letting them\r\n know what we're up to.\r\n - Everyone who is driving a process or project or thing, please send Mike\r\n a single slide, what is it, why are we doing it, what the status is,\r\n what's coming next, and if you need anything\r\n - Christine on metrics\r\n - Luigi for SECURITY-INSIGHTS.yml`\r\n - Mike will send out a template\r\n - Mike says the metrics work should live under a working group, maybe this one, maybe best practices\r\n - CRob might have an opinion here, as long as work gets done\r\n - As an org OpenSSF would benefit by being less siloed\r\n - Question on if we should align to streams?\r\n - LFX specific definition of metrics in mobilization paper\r\n - AR to sync with CRob and see what he thinks.\r\n - Will raise with TAC next week.\r\n- A few action items for metrics from Christine\r\n - Working groups are adopting streams from the mobilization plans\r\n- Collection of data into shared DB\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n- Alpha Omega\r\n - A few people were on the public call earlier\r\n - The recording will be on YouTube\r\n- TODO\r\n - @pdxjohnny\r\n - Reach out to Christine about metrics collaboration\r\n - Watch Alpha Omega recording on YouTube",
"editedAt": "2022-07-20T17:25:49Z"
},
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n - Will do doodle poll or something for slots\r\n - Jen is the one for the Zoom setup\r\n - Amir from ostif.org volunteering to answer questions\r\n - May want to do a blog in addition to twitter\r\n - Outreach maybe 4th or 5th, have the twitter points back\r\n to the blog to capture that.\r\n- Meeting time update\r\n - We have been doing this at this time for about a year or so\r\n - We previously alternated between two timeslots for Europe and Asia\r\n - Should we keep this 10 AM Pacific timeslot?\r\n - Alternate between US and APAC friendly timezone\r\n - Most other WGs are morning Pacific time\r\n- Technical Advisory Committee (TAC) update\r\n - They are tasked with making sure we are delivering on our\r\n cohesive promise, part of that is visuabliity and transparency\r\n into the work that we do.\r\n - We now have a formal reporting process\r\n - It's not a periodic we're all invited to show up to the TAC meeting\r\n one slide per project.\r\n - What we're doing\r\n - Why we're doing it\r\n - It's meant as an FYI, we are not asking for approval, we're letting them\r\n know what we're up to.\r\n - Everyone who is driving a process or project or thing, please send Mike\r\n a single slide, what is it, why are we doing it, what the status is,\r\n what's coming next, and if you need anything\r\n - Christine on metrics\r\n - Luigi for SECURITY-INSIGHTS.yml`\r\n - Mike will send out a template\r\n - Mike says the metrics work should live under a working group, maybe this one, maybe best practices\r\n - CRob might have an opinion here, as long as work gets done\r\n - As an org OpenSSF would benefit by being less siloed\r\n - Question on if we should align to streams?\r\n - LFX specific definition of metrics in mobilization paper\r\n - AR to sync with CRob and see what he thinks.\r\n - Will raise with TAC next week.\r\n- A few action items for metrics from Christine\r\n - Working groups are adopting streams from the mobilization plans\r\n- Collection of data into shared DB\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture\r\n- TODO\r\n - @pdxjohnny\r\n - Reach out to Christine about metrics collaboration",
"editedAt": "2022-07-20T17:24:47Z"
},
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n - Will do doodle poll or something for slots\r\n - Jen is the one for the Zoom setup\r\n - Amir from ostif.org volunteering to answer questions\r\n - May want to do a blog in addition to twitter\r\n - Outreach maybe 4th or 5th, have the twitter points back\r\n to the blog to capture that.\r\n- Meeting time update\r\n - We have been doing this at this time for about a year or so\r\n - We previously alternated between two timeslots for Europe and Asia\r\n - Should we keep this 10 AM Pacific timeslot?\r\n - Most other WGs are morning Pacific time\r\n- Collection of data into shared DB\r\n - https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n - https://datatracker.ietf.org/doc/html/draft-birkholz-scitt-architecture\r\n - https://www.w3.org/2022/07/pressrelease-did-rec.html.en\r\n - https://docs.microsoft.com/en-us/azure/confidential-ledger/architecture",
"editedAt": "2022-07-20T17:15:46Z"
},
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n - Will do doodle poll or something for slots\r\n - Jen is the one for the Zoom setup\r\n - Amir from ostif.org volunteering to answer questions\r\n - May want to do a blog in addition to twitter\r\n - Outreach maybe 4th or 5th, have the twitter points back\r\n to the blog to capture that.\r\n- ",
"editedAt": "2022-07-20T17:13:44Z"
},
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n - Will do doodle poll or something for slots\r\n - Jen is the one for the Zoom setup\r\n- ",
"editedAt": "2022-07-20T17:12:22Z"
},
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n - Reaching out to logistics team for how we can communicate\r\n zoom links, etc.\r\n - Will test registration beginning of August\r\n- ",
"editedAt": "2022-07-20T17:11:43Z"
},
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - Question is will be be well staffed for both of those\r\n - She is collecting feedback right now on possibilities for those dates\r\n - Pinging folks who have shown interest in the past\r\n - What format?\r\n - People just show up and ask\r\n - Registration with topic they want to talk about\r\n - Allows us to prepare, consensus is currently we like this\r\n - Can grab right experts beforehand this way\r\n- \r\n",
"editedAt": "2022-07-20T17:10:50Z"
},
{
"diff": "# 2022-07-20 Identifying Security Threats WG\r\n\r\n- Mike leading\r\n- Marta\r\n - Office hours\r\n - Place for Open Source maintainers to be able to ask community of security experts\r\n - Idea is to run first two sessions in August and September\r\n - Proposing two different timeslots to cover all geos\r\n - \r\n- \r\n",
"editedAt": "2022-07-20T17:09:05Z"
}
]
}
}
]
}
},
{
"body": "# 2022-07-21 Engineering Logs\r\n\r\n- https://docs.rs/differential-dataflow/latest/differential_dataflow/\r\n- https://lists.spdx.org/g/Spdx-tech/message/4673\r\n - > It is not just a matter of your software, it is a fundamental design question whether to maintain separation between the logical model and its serializations. Maintaining separation shouldn't be a matter of personal preference, it's good software engineering. The OWL Web Ontology Language https://www.w3.org/TR/owl2-overview/ has an excellent diagram illustrating the separation between semantics and syntax. Several serializations are defined in OWL (Manchester Syntax, Functional Syntax, RDF/XML, OWL/XML, and Turtle), and more syntaxes have been added since (JSON-LD, RDF-star, ...).\r\n- Knock and the gates will be opened. This morning they were. The beauty of the light is blinding.\r\n- Gravity: our top level system context\u2019s reward mechanism for alignment. The universe seeks harmonization. Where else do we go after a big bang? Back to convergence, rinse and repeat. The balance between extremes is found in the aggregate across cycles. Remember, the architecture is the same up and down the stack.\r\n - https://bigthink.com/hard-science/physicist-radical-theory-of-gravity/",
"createdAt": "2022-07-21T22:04:10Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABOCrA==",
"hasNextPage": false
},
"totalCount": 5,
"nodes": [
{
"diff": "# 2022-07-21 Engineering Logs\r\n\r\n- https://docs.rs/differential-dataflow/latest/differential_dataflow/\r\n- https://lists.spdx.org/g/Spdx-tech/message/4673\r\n - > It is not just a matter of your software, it is a fundamental design question whether to maintain separation between the logical model and its serializations. Maintaining separation shouldn't be a matter of personal preference, it's good software engineering. The OWL Web Ontology Language https://www.w3.org/TR/owl2-overview/ has an excellent diagram illustrating the separation between semantics and syntax. Several serializations are defined in OWL (Manchester Syntax, Functional Syntax, RDF/XML, OWL/XML, and Turtle), and more syntaxes have been added since (JSON-LD, RDF-star, ...).\r\n- Knock and the gates will be opened. This morning they were. The beauty of the light is blinding.\r\n- Gravity: our top level system context\u2019s reward mechanism for alignment. The universe seeks harmonization. Where else do we go after a big bang? Back to convergence, rinse and repeat. The balance between extremes is found in the aggregate across cycles. Remember, the architecture is the same up and down the stack.\r\n - https://bigthink.com/hard-science/physicist-radical-theory-of-gravity/",
"editedAt": "2022-07-22T01:00:24Z"
},
{
"diff": "# 2022-07-21 Engineering Logs\r\n\r\n- https://lists.spdx.org/g/Spdx-tech/message/4673\r\n - > It is not just a matter of your software, it is a fundamental design question whether to maintain separation between the logical model and its serializations. Maintaining separation shouldn't be a matter of personal preference, it's good software engineering. The OWL Web Ontology Language https://www.w3.org/TR/owl2-overview/ has an excellent diagram illustrating the separation between semantics and syntax. Several serializations are defined in OWL (Manchester Syntax, Functional Syntax, RDF/XML, OWL/XML, and Turtle), and more syntaxes have been added since (JSON-LD, RDF-star, ...).\r\n- Knock and the gates will be opened. This morning they were. The beauty of the light is blinding.\r\n- Gravity: our top level system context\u2019s reward mechanism for alignment. The universe seeks harmonization. Where else do we go after a big bang? Back to convergence, rinse and repeat. The balance between extremes is found in the aggregate across cycles. Remember, the architecture is the same up and down the stack.\r\n - https://bigthink.com/hard-science/physicist-radical-theory-of-gravity/",
"editedAt": "2022-07-21T22:13:06Z"
},
{
"diff": "# 2022-07-21\r\n\r\n- Knock and the gates will be opened. This morning they were. The beauty of the light is blinding.\r\n- Gravity: our top level system context\u2019s reward mechanism for alignment. The universe seeks harmonization. Where else do we go after a big bang? Back to convergence, rinse and repeat. The balance between extremes is found in the aggregate across cycles. Remember, the architecture is the same up and down the stack.\r\n - https://bigthink.com/hard-science/physicist-radical-theory-of-gravity/",
"editedAt": "2022-07-21T22:07:12Z"
},
{
"diff": "# 2022-07-21\r\n\r\n- Knock and the gates will be opened. This morning they were.\r\n- Gravity: our top level system context\u2019s reward mechanism for alignment. The universe seeks harmonization. Where else do we go after a big bang? Back to convergence, rinse and repeat. The balance between extremes is found in the aggregate across cycles. Remember, the architecture is the same up and down the stack.\r\n - https://bigthink.com/hard-science/physicist-radical-theory-of-gravity/",
"editedAt": "2022-07-21T22:06:00Z"
},
{
"diff": "# 2022-07-21\r\n\r\n- Gravity: our top level system context\u2019s reward mechanism for alignment. The universe seeks harmonization. Where else do we go after a big bang? Back to convergence, rinse and repeat. The balance between extremes is found in the aggregate across cycles. Remember, the architecture is the same up and down the stack.\r\n - https://bigthink.com/hard-science/physicist-radical-theory-of-gravity/",
"editedAt": "2022-07-21T22:04:10Z"
}
]
},
"replies": {
"pageInfo": {
"endCursor": null,
"hasNextPage": false
},
"totalCount": 0,
"nodes": []
}
},
{
"body": "# 2022-07-23\r\n\r\n- https://blog.ciaranmcnulty.com/2022-05-12-multiple-build-contexts\r\n- The Entire Universe Is Intelligent, Our Every Thought Is Connected With Distant Worlds\r\n - Yup, the architecture scales\r\n - https://siamtoo.com/6805/",
"createdAt": "2022-07-23T17:54:28Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABOuiA==",
"hasNextPage": false
},
"totalCount": 2,
"nodes": [
{
"diff": "# 2022-07-23\r\n\r\n- https://blog.ciaranmcnulty.com/2022-05-12-multiple-build-contexts\r\n- The Entire Universe Is Intelligent, Our Every Thought Is Connected With Distant Worlds\r\n - Yup, the architecture scales\r\n - https://siamtoo.com/6805/",
"editedAt": "2022-07-26T03:44:50Z"
},
{
"diff": "# 2022-07-23\r\n\r\n- https://blog.ciaranmcnulty.com/2022-05-12-multiple-build-contexts",
"editedAt": "2022-07-23T17:54:28Z"
}
]
},
"replies": {
"pageInfo": {
"endCursor": null,
"hasNextPage": false
},
"totalCount": 0,
"nodes": []
}
},
{
"body": "# 2022-07-28 Alice Initiative/Open Architecture Working Group Initial Meeting\r\n\r\n- Meeting info\r\n - 8-9 AM Pacific\r\n - https://meet.google.com/kox-ssqn-kjd",
"createdAt": "2022-07-24T15:54:29Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABOm3g==",
"hasNextPage": false
},
"totalCount": 2,
"nodes": [
{
"diff": "# 2022-07-28 Alice Initiative/Open Architecture Working Group Initial Meeting\r\n\r\n- Meeting info\r\n - 8-9 AM Pacific\r\n - https://meet.google.com/kox-ssqn-kjd",
"editedAt": "2022-07-25T13:47:53Z"
},
{
"diff": "# 2022-07-28 Alice Initiative Working Group Meeting\r\n\r\n- Meeting info\r\n - 8-9 AM Pacific\r\n - https://meet.google.com/kox-ssqn-kjd",
"editedAt": "2022-07-24T15:54:29Z"
}
]
},
"replies": {
"pageInfo": {
"endCursor": null,
"hasNextPage": false
},
"totalCount": 0,
"nodes": []
}
},
{
"body": "# 2022-07-25 Engineering Logs\r\n\r\n- TODO\r\n - [x] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] @dffml: Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.",
"createdAt": "2022-07-25T14:50:20Z",
"userContentEdits": {
"pageInfo": {
"endCursor": null,
"hasNextPage": false
},
"totalCount": 0,
"nodes": []
},
"replies": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpK5MjAyMi0wNy0yNVQwODowMzo0My0wNzowMM4AMS9B",
"hasNextPage": false
},
"totalCount": 2,
"nodes": [
{
"body": "## 2022-07-25 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- References\r\n - https://spdx.github.io/canonical-serialisation/",
"createdAt": "2022-07-25T14:50:57Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABOnvA==",
"hasNextPage": false
},
"totalCount": 3,
"nodes": [
{
"diff": "## 2022-07-25 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n- References\r\n - https://spdx.github.io/canonical-serialisation/",
"editedAt": "2022-07-25T17:34:22Z"
},
{
"diff": "## 2022-07-25 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`",
"editedAt": "2022-07-25T14:51:12Z"
},
{
"diff": "## 2022-07-25 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [ ] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`",
"editedAt": "2022-07-25T14:50:57Z"
}
]
}
},
{
"body": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n- Links\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247\r\n - https://github.com/transmute-industries/openssl-did-web-tutorial\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- Typical global warming centric chit chat\r\n- Others at RATs meeting or busy with other IETF activities\r\n- Introductions\r\n - Kelvin Cusack\r\n - Filling in for John from his org\r\n - John Andersen\r\n - Connecting dots between this and OpenSSF\r\n - Yogesh was excited to see someone from Intel\r\n - Intel is involved in RATs but not as much here\r\n - Kiran Karunakaran\r\n - Microsoft\r\n - On Kay Williams's team\r\n - Will likely lead this meeting in the future\r\n- Upcoming Birds of Feather (BoF)\r\n - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp\r\n - There is registration for remote\r\n - Problem Statement currently scoped around software\r\n - We went back to a more scoped problem statement is that we want to form a formal working group in the IETF for SCITT.\r\n - In order to form it we have to have stuffiest people show interest in the problem space\r\n - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress\r\n - The early design proposal for the SCITT transparency service is that the service is content agnostic\r\n - any kind of metadata could be provided and retrieved\r\n - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products\r\n - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. 
govt.)\r\n - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software\r\n - Leadership thought scope was too big at first\r\n- Thoughts around scope\r\n - Charter is focused on software, an attainable goal\r\n - Once we have a WG we can later broaden the scope via re-chartering\r\n- We will design something that works for hardware and for software\r\n - We are hanging software window curtains but we are looking at everything\r\n- Software systems interact with everything else\r\n - Dick Brooks (REA), any manifest could be signed and processes with SCITT\r\n - It's just metadata, what was issued, how it was issued\r\n - @dffml: We are encoding system contexts, Alice, into the chain, one place where she will live.\r\n- Opens\r\n - Open Policy Agent (mentioned in meeting minutes doc future topics)\r\n - What are your plans / thoughts around Open Policy Agent and encoding policies into SCITT?\r\n - Policy can be used in two places\r\n - Policy for what can be put onto register\r\n - Some registries might constrain themselves for what types of data they allow\r\n - Policy for someone evaluating the contents of the registry to make discussions for fitness of use\r\n - REGO also considered as a policy language\r\n - Perhaps decide on multi\r\n - This policy discussion will happen in this WG for now, then maybe a sub working group\r\n - Dick Brooks: Mentions HR4081\r\n - On topic for what we are \r\n - Talked about attestations for devices containing a camera and microphone and is connected to the internet\r\n - There will need to be an attestation from the device \r\n - Dick submitted to local rep to include software attestations as well\r\n - https://www.congress.gov/bill/117th-congress/house-bill/4081\r\nhttps://www.congress.gov/bill/117th-congress/house-bill/4081/text\r\n- Producers and other parties provide content into the system, attestations, claims, recorded into SCITT ledger\r\n- Dick contacted his congress person to ask to add an amendment to HR4081\r\n - Amendment for smartphone apps to provide a trust score\r\n - Tie in with OpenSSF metrics database to grab the security of repos involved\r\n - Dick\r\n - Proposed amendment I mentioned for HR 4081:\r\n - \"Require smart phone app stores to include a software supply chain trust score for each app\". This gives consumers the ability to check trustworthiness before installing an app,\r\n- Ray: think about attestations is different than the transparency ledger\r\n - Thinks it's a lot to bite off to do both\r\n - Are there IoT folks that might have more attestation experience we could tap into?\r\n - Is there a sub-working group focused on device attestations (in response to HR4081)\r\n - Device attestations could be recorded tin the transparency ledger\r\n - TCG DICE WG is a target point of engagement (UCAN, CBOR, DID).\r\n - https://trustedcomputinggroup.org/work-groups/dice-architectures/\r\n - https://www.trustedcomputinggroup.org/wp-content/uploads/Device-Identifier-Composition-Engine-Rev69_Public-Review.pdf\r\n - Looking at hardware actively attesting\r\n - Microsoft has an Open Source implementation on GitHub\r\n - This attestation stuff starts to look at real life commerce, Ray thinks it's important to \r\n- Joshua Lock\r\n - On software attestations, I have been working on a page for the SLSA website to describe the model we're working with. 
I can share to the SCITT list once the change is merged.\r\n - IIRC the SCITT draft standards refer to a 'software attestation\" as a \"claim\", to disambiguate from RATS & TCG attestations\r\n- Remote Attestation and Device Attestation\r\n - Embraced COSE and CBOR\r\n - Also in SCITT\r\n - Hopefully we converge on underlying formats for both in-toto style and remote attestation style attestations\r\n- There are also NIST attestations\r\n- Vuln information mentioned by Kay as possible content inserted into SCITT\r\n - This is a goal of ours with our CVE Binary Tool engagement\r\n - We also could encode SBOMs from the systems that built them, we could patch sigstore to insert into a SCITT ledger",
"createdAt": "2022-07-25T15:03:43Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABOn8A==",
"hasNextPage": false
},
"totalCount": 13,
"nodes": [
{
"diff": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n- Links\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247\r\n - https://github.com/transmute-industries/openssl-did-web-tutorial\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- Typical global warming centric chit chat\r\n- Others at RATs meeting or busy with other IETF activities\r\n- Introductions\r\n - Kelvin Cusack\r\n - Filling in for John from his org\r\n - John Andersen\r\n - Connecting dots between this and OpenSSF\r\n - Yogesh was excited to see someone from Intel\r\n - Intel is involved in RATs but not as much here\r\n - Kiran Karunakaran\r\n - Microsoft\r\n - On Kay Williams's team\r\n - Will likely lead this meeting in the future\r\n- Upcoming Birds of Feather (BoF)\r\n - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp\r\n - There is registration for remote\r\n - Problem Statement currently scoped around software\r\n - We went back to a more scoped problem statement is that we want to form a formal working group in the IETF for SCITT.\r\n - In order to form it we have to have stuffiest people show interest in the problem space\r\n - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress\r\n - The early design proposal for the SCITT transparency service is that the service is content agnostic\r\n - any kind of metadata could be provided and retrieved\r\n - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products\r\n - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. 
govt.)\r\n - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software\r\n - Leadership thought scope was too big at first\r\n- Thoughts around scope\r\n - Charter is focused on software, an attainable goal\r\n - Once we have a WG we can later broaden the scope via re-chartering\r\n- We will design something that works for hardware and for software\r\n - We are hanging software window curtains but we are looking at everything\r\n- Software systems interact with everything else\r\n - Dick Brooks (REA), any manifest could be signed and processes with SCITT\r\n - It's just metadata, what was issued, how it was issued\r\n - @dffml: We are encoding system contexts, Alice, into the chain, one place where she will live.\r\n- Opens\r\n - Open Policy Agent (mentioned in meeting minutes doc future topics)\r\n - What are your plans / thoughts around Open Policy Agent and encoding policies into SCITT?\r\n - Policy can be used in two places\r\n - Policy for what can be put onto register\r\n - Some registries might constrain themselves for what types of data they allow\r\n - Policy for someone evaluating the contents of the registry to make discussions for fitness of use\r\n - REGO also considered as a policy language\r\n - Perhaps decide on multi\r\n - This policy discussion will happen in this WG for now, then maybe a sub working group\r\n - Dick Brooks: Mentions HR4081\r\n - On topic for what we are \r\n - Talked about attestations for devices containing a camera and microphone and is connected to the internet\r\n - There will need to be an attestation from the device \r\n - Dick submitted to local rep to include software attestations as well\r\n - https://www.congress.gov/bill/117th-congress/house-bill/4081\r\nhttps://www.congress.gov/bill/117th-congress/house-bill/4081/text\r\n- Producers and other parties provide content into the system, attestations, claims, recorded into SCITT ledger\r\n- Dick contacted his congress person to ask to add an amendment to HR4081\r\n - Amendment for smartphone apps to provide a trust score\r\n - Tie in with OpenSSF metrics database to grab the security of repos involved\r\n - Dick\r\n - Proposed amendment I mentioned for HR 4081:\r\n - \"Require smart phone app stores to include a software supply chain trust score for each app\". This gives consumers the ability to check trustworthiness before installing an app,\r\n- Ray: think about attestations is different than the transparency ledger\r\n - Thinks it's a lot to bite off to do both\r\n - Are there IoT folks that might have more attestation experience we could tap into?\r\n - Is there a sub-working group focused on device attestations (in response to HR4081)\r\n - Device attestations could be recorded tin the transparency ledger\r\n - TCG DICE WG is a target point of engagement (UCAN, CBOR, DID).\r\n - https://trustedcomputinggroup.org/work-groups/dice-architectures/\r\n - https://www.trustedcomputinggroup.org/wp-content/uploads/Device-Identifier-Composition-Engine-Rev69_Public-Review.pdf\r\n - Looking at hardware actively attesting\r\n - Microsoft has an Open Source implementation on GitHub\r\n - This attestation stuff starts to look at real life commerce, Ray thinks it's important to \r\n- Joshua Lock\r\n - On software attestations, I have been working on a page for the SLSA website to describe the model we're working with. 
I can share to the SCITT list once the change is merged.\r\n - IIRC the SCITT draft standards refer to a 'software attestation\" as a \"claim\", to disambiguate from RATS & TCG attestations\r\n- Remote Attestation and Device Attestation\r\n - Embraced COSE and CBOR\r\n - Also in SCITT\r\n - Hopefully we converge on underlying formats for both in-toto style and remote attestation style attestations\r\n- There are also NIST attestations\r\n- Vuln information mentioned by Kay as possible content inserted into SCITT\r\n - This is a goal of ours with our CVE Binary Tool engagement\r\n - We also could encode SBOMs from the systems that built them, we could patch sigstore to insert into a SCITT ledger",
"editedAt": "2022-07-25T16:42:05Z"
},
{
"diff": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n- Links\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- Typical global warming centric chit chat\r\n- Others at RATs meeting or busy with other IETF activities\r\n- Introductions\r\n - Kelvin Cusack\r\n - Filling in for John from his org\r\n - John Andersen\r\n - Connecting dots between this and OpenSSF\r\n - Yogesh was excited to see someone from Intel\r\n - Intel is involved in RATs but not as much here\r\n - Kiran Karunakaran\r\n - Microsoft\r\n - On Kay Williams's team\r\n - Will likely lead this meeting in the future\r\n- Upcoming Birds of Feather (BoF)\r\n - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp\r\n - There is registration for remote\r\n - Problem Statement currently scoped around software\r\n - We went back to a more scoped problem statement is that we want to form a formal working group in the IETF for SCITT.\r\n - In order to form it we have to have stuffiest people show interest in the problem space\r\n - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress\r\n - The early design proposal for the SCITT transparency service is that the service is content agnostic\r\n - any kind of metadata could be provided and retrieved\r\n - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products\r\n - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. 
govt.)\r\n - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software\r\n - Leadership thought scope was too big at first\r\n- Thoughts around scope\r\n - Charter is focused on software, an attainable goal\r\n - Once we have a WG we can later broaden the scope via re-chartering\r\n- We will design something that works for hardware and for software\r\n - We are hanging software window curtains but we are looking at everything\r\n- Software systems interact with everything else\r\n - Dick Brooks (REA), any manifest could be signed and processes with SCITT\r\n - It's just metadata, what was issued, how it was issued\r\n - @dffml: We are encoding system contexts, Alice, into the chain, one place where she will live.\r\n- Opens\r\n - Open Policy Agent (mentioned in meeting minutes doc future topics)\r\n - What are your plans / thoughts around Open Policy Agent and encoding policies into SCITT?\r\n - Policy can be used in two places\r\n - Policy for what can be put onto register\r\n - Some registries might constrain themselves for what types of data they allow\r\n - Policy for someone evaluating the contents of the registry to make discussions for fitness of use\r\n - REGO also considered as a policy language\r\n - Perhaps decide on multi\r\n - This policy discussion will happen in this WG for now, then maybe a sub working group\r\n - Dick Brooks: Mentions HR4081\r\n - On topic for what we are \r\n - Talked about attestations for devices containing a camera and microphone and is connected to the internet\r\n - There will need to be an attestation from the device \r\n - Dick submitted to local rep to include software attestations as well\r\n - https://www.congress.gov/bill/117th-congress/house-bill/4081\r\nhttps://www.congress.gov/bill/117th-congress/house-bill/4081/text\r\n- Producers and other parties provide content into the system, attestations, claims, recorded into SCITT ledger\r\n- Dick contacted his congress person to ask to add an amendment to HR4081\r\n - Amendment for smartphone apps to provide a trust score\r\n - Tie in with OpenSSF metrics database to grab the security of repos involved\r\n - Dick\r\n - Proposed amendment I mentioned for HR 4081:\r\n - \"Require smart phone app stores to include a software supply chain trust score for each app\". This gives consumers the ability to check trustworthiness before installing an app,\r\n- Ray: think about attestations is different than the transparency ledger\r\n - Thinks it's a lot to bite off to do both\r\n - Are there IoT folks that might have more attestation experience we could tap into?\r\n - Is there a sub-working group focused on device attestations (in response to HR4081)\r\n - Device attestations could be recorded tin the transparency ledger\r\n - TCG DICE WG is a target point of engagement (UCAN, CBOR, DID).\r\n - https://trustedcomputinggroup.org/work-groups/dice-architectures/\r\n - https://www.trustedcomputinggroup.org/wp-content/uploads/Device-Identifier-Composition-Engine-Rev69_Public-Review.pdf\r\n - Looking at hardware actively attesting\r\n - Microsoft has an Open Source implementation on GitHub\r\n - This attestation stuff starts to look at real life commerce, Ray thinks it's important to \r\n- Joshua Lock\r\n - On software attestations, I have been working on a page for the SLSA website to describe the model we're working with. 
I can share to the SCITT list once the change is merged.\r\n - IIRC the SCITT draft standards refer to a 'software attestation\" as a \"claim\", to disambiguate from RATS & TCG attestations\r\n- Remote Attestation and Device Attestation\r\n - Embraced COSE and CBOR\r\n - Also in SCITT\r\n - Hopefully we converge on underlying formats for both in-toto style and remote attestation style attestations\r\n- There are also NIST attestations\r\n- Vuln information mentioned by Kay as possible content inserted into SCITT\r\n - This is a goal of ours with our CVE Binary Tool engagement\r\n - We also could encode SBOMs from the systems that built them, we could patch sigstore to insert into a SCITT ledger",
"editedAt": "2022-07-25T16:20:07Z"
},
{
"diff": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n- Links\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- Typical global warming centric chit chat\r\n- Others at RATs meeting or busy with other IETF activities\r\n- Introductions\r\n - Kelvin Cusack\r\n - Filling in for John from his org\r\n - John Andersen\r\n - Connecting dots between this and OpenSSF\r\n - Yogesh was excited to see someone from Intel\r\n - Intel is involved in RATs but not as much here\r\n - Kiran Karunakaran\r\n - Microsoft\r\n - On Kay Williams's team\r\n - Will likely lead this meeting in the future\r\n- Upcoming Birds of Feather (BoF)\r\n - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp\r\n - There is registration for remote\r\n - Problem Statement currently scoped around software\r\n - We went back to a more scoped problem statement is that we want to form a formal working group in the IETF for SCITT.\r\n - In order to form it we have to have stuffiest people show interest in the problem space\r\n - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress\r\n - The early design proposal for the SCITT transparency service is that the service is content agnostic\r\n - any kind of metadata could be provided and retrieved\r\n - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products\r\n - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. 
govt.)\r\n - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software\r\n - Leadership thought scope was too big at first\r\n- Thoughts around scope\r\n - Charter is focused on software, an attainable goal\r\n - Once we have a WG we can later broaden the scope via re-chartering\r\n- We will design something that works for hardware and for software\r\n - We are hanging software window curtains but we are looking at everything\r\n- Software systems interact with everything else\r\n - Dick Brooks (REA), any manifest could be signed and processes with SCITT\r\n - It's just metadata, what was issued, how it was issued\r\n - @dffml: We are encoding system contexts, Alice, into the chain, one place where she will live.\r\n- Opens\r\n - Open Policy Agent (mentioned in meeting minutes doc future topics)\r\n - What are your plans / thoughts around Open Policy Agent and encoding policies into SCITT?\r\n - Policy can be used in two places\r\n - Policy for what can be put onto register\r\n - Some registries might constrain themselves for what types of data they allow\r\n - Policy for someone evaluating the contents of the registry to make discussions for fitness of use\r\n - REGO also considered as a policy language\r\n - Perhaps decide on multi\r\n - This policy discussion will happen in this WG for now, then maybe a sub working group\r\n - Dick Brooks: Mentions HR4081\r\n - On topic for what we are \r\n - Talked about attestations for devices containing a camera and microphone and is connected to the internet\r\n - There will need to be an attestation from the device \r\n - Dick submitted to local rep to include software attestations as well\r\n - https://www.congress.gov/bill/117th-congress/house-bill/4081\r\nhttps://www.congress.gov/bill/117th-congress/house-bill/4081/text\r\n- Producers and other parties provide content into the system, attestations, claims, recorded into SCITT ledger\r\n- Dick said that you an contact your congress person to make sure we can see an impact because there is\r\n - Amendment for smartphone apps to provide a trust score\r\n - Tie in with OpenSSF metrics database to grab the security of repos involved\r\n - Dick\r\n - Proposed amendment I mentioned for HR 4081:\r\n - \"Require smart phone app stores to include a software supply chain trust score for each app\". This gives consumers the ability to check trustworthiness before installing an app,\r\n- Ray: think about attestations is different than the transparency ledger\r\n - Thinks it's a lot to bite off to do both\r\n - Are there IoT folks that might have more attestation experience we could tap into?\r\n - Is there a sub-working group focused on device attestations (in response to HR4081)\r\n - Device attestations could be recorded tin the transparency ledger\r\n - TCG DICE WG is a target point of engagement (UCAN, CBOR, DID).\r\n - https://trustedcomputinggroup.org/work-groups/dice-architectures/\r\n - https://www.trustedcomputinggroup.org/wp-content/uploads/Device-Identifier-Composition-Engine-Rev69_Public-Review.pdf\r\n - Looking at hardware actively attesting\r\n - Microsoft has an Open Source implementation on GitHub\r\n - This attestation stuff starts to look at real life commerce, Ray thinks it's important to \r\n- Joshua Lock\r\n - On software attestations, I have been working on a page for the SLSA website to describe the model we're working with. 
I can share to the SCITT list once the change is merged.\r\n - IIRC the SCITT draft standards refer to a 'software attestation\" as a \"claim\", to disambiguate from RATS & TCG attestations\r\n- Remote Attestation and Device Attestation\r\n - Embraced COSE and CBOR\r\n - Also in SCITT\r\n - Hopefully we converge on underlying formats for both in-toto style and remote attestation style attestations\r\n- There are also NIST attestations\r\n- Vuln information mentioned by Kay as possible content inserted into SCITT\r\n - This is a goal of ours with our CVE Binary Tool engagement\r\n - We also could encode SBOMs from the systems that built them, we could patch sigstore to insert into a SCITT ledger",
"editedAt": "2022-07-25T16:19:14Z"
},
{
"diff": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n- Links\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- Global warming discussions\r\n- Others at RATs meeting or busy with other IETF activities\r\n- Introductions\r\n - Kelvin Cusack\r\n - Filling in for John from his org\r\n - John Andersen\r\n - Connecting dots between this and OpenSSF\r\n - Yogesh was excited to see someone from Intel\r\n - Intel is involved in RATs but not as much here\r\n - Kiran Karunakaran\r\n - Microsoft\r\n - On Kay Williams's team\r\n - Will likely lead this meeting in the future\r\n- Upcoming Birds of Feather (BoF)\r\n - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp\r\n - There is registration for remote\r\n - Problem Statement currently scoped around software\r\n - We went back to a more scoped problem statement is that we want to form a formal working group in the IETF for SCITT.\r\n - In order to form it we have to have stuffiest people show interest in the problem space\r\n - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress\r\n - The early design proposal for the SCITT transparency service is that the service is content agnostic\r\n - any kind of metadata could be provided and retrieved\r\n - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products\r\n - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. 
govt.)\r\n - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software\r\n - Leadership thought scope was too big at first\r\n- Thoughts around scope\r\n - Charter is focused on software, an attainable goal\r\n - Once we have a WG we can later broaden the scope via re-chartering\r\n- We will design something that works for hardware and for software\r\n - We are hanging software window curtains but we are looking at everything\r\n- Software systems interact with everything else\r\n - Dick Brooks (REA), any manifest could be signed and processes with SCITT\r\n - It's just metadata, what was issued, how it was issued\r\n - @dffml: We are encoding system contexts, Alice, into the chain, one place where she will live.\r\n- Opens\r\n - Open Policy Agent (mentioned in meeting minutes doc future topics)\r\n - What are your plans / thoughts around Open Policy Agent and encoding policies into SCITT?\r\n - Policy can be used in two places\r\n - Policy for what can be put onto register\r\n - Some registries might constrain themselves for what types of data they allow\r\n - Policy for someone evaluating the contents of the registry to make discussions for fitness of use\r\n - REGO also considered as a policy language\r\n - Perhaps decide on multi\r\n - This policy discussion will happen in this WG for now, then maybe a sub working group\r\n - Dick Brooks: Mentions HR4081\r\n - On topic for what we are \r\n - Talked about attestations for devices containing a camera and microphone and is connected to the internet\r\n - There will need to be an attestation from the device \r\n - Dick submitted to local rep to include software attestations as well\r\n - https://www.congress.gov/bill/117th-congress/house-bill/4081\r\nhttps://www.congress.gov/bill/117th-congress/house-bill/4081/text\r\n- Producers and other parties provide content into the system, attestations, claims, recorded into SCITT ledger\r\n- Dick said that you an contact your congress person to make sure we can see an impact because there is\r\n - Amendment for smartphone apps to provide a trust score\r\n - Tie in with OpenSSF metrics database to grab the security of repos involved\r\n - Dick\r\n - Proposed amendment I mentioned for HR 4081:\r\n - \"Require smart phone app stores to include a software supply chain trust score for each app\". This gives consumers the ability to check trustworthiness before installing an app,\r\n- Ray: think about attestations is different than the transparency ledger\r\n - Thinks it's a lot to bite off to do both\r\n - Are there IoT folks that might have more attestation experience we could tap into?\r\n - Is there a sub-working group focused on device attestations (in response to HR4081)\r\n - Device attestations could be recorded tin the transparency ledger\r\n - TCG DICE WG is a target point of engagement (UCAN, CBOR, DID).\r\n - https://trustedcomputinggroup.org/work-groups/dice-architectures/\r\n - https://www.trustedcomputinggroup.org/wp-content/uploads/Device-Identifier-Composition-Engine-Rev69_Public-Review.pdf\r\n - Looking at hardware actively attesting\r\n - Microsoft has an Open Source implementation on GitHub\r\n - This attestation stuff starts to look at real life commerce, Ray thinks it's important to \r\n- Joshua Lock\r\n - On software attestations, I have been working on a page for the SLSA website to describe the model we're working with. 
I can share to the SCITT list once the change is merged.\r\n - IIRC the SCITT draft standards refer to a 'software attestation\" as a \"claim\", to disambiguate from RATS & TCG attestations\r\n- Remote Attestation and Device Attestation\r\n - Embraced COSE and CBOR\r\n - Also in SCITT\r\n - Hopefully we converge on underlying formats for both in-toto style and remote attestation style attestations\r\n- There are also NIST attestations\r\n- Vuln information mentioned by Kay as possible content inserted into SCITT\r\n - This is a goal of ours with our CVE Binary Tool engagement\r\n - We also could encode SBOMs from the systems that built them, we could patch sigstore to insert into a SCITT ledger",
"editedAt": "2022-07-25T16:18:37Z"
},
{
"diff": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n- Links\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- Global warming discussions\r\n- Others at RATs meeting or busy with other IETF activities\r\n- Introductions\r\n - Kelvin Cusack\r\n - Filling in for John from his org\r\n - John Andersen\r\n - Connecting dots between this and OpenSSF\r\n - Yogesh was excited to see someone from Intel\r\n - Intel is involved in RATs but not as much here\r\n - Kiran Karunakaran\r\n - Microsoft\r\n - On Kay Williams's team\r\n - Will likely lead this meeting in the future\r\n- Upcoming Birds of Feather (BoF)\r\n - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp\r\n - There is registration for remote\r\n - Problem Statement currently scoped around software\r\n - We went back to a more scoped problem statement is that we want to form a formal working group in the IETF for SCITT.\r\n - In order to form it we have to have stuffiest people show interest in the problem space\r\n - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress\r\n - The early design proposal for the SCITT transparency service is that the service is content agnostic\r\n - any kind of metadata could be provided and retrieved\r\n - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products\r\n - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. 
govt.)\r\n - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software\r\n - Leadership thought scope was too big at first\r\n- Thoughts around scope\r\n - Charter is focused on software, an attainable goal\r\n - Once we have a WG we can later broaden the scope via re-chartering\r\n- We will design something that works for hardware and for software\r\n - We are hanging software window curtains but we are looking at everything\r\n- Software systems interact with everything else\r\n - Dick Brooks (REA), any manifest could be signed and processes with SCITT\r\n - It's just metadata, what was issued, how it was issued\r\n - @dffml: We are encoding system contexts, Alice, into the chain, one place where she will live.\r\n- Opens\r\n - Open Policy Agent (mentioned in meeting minutes doc future topics)\r\n - What are your plans / thoughts around Open Policy Agent and encoding policies into SCITT?\r\n - Policy can be used in two places\r\n - Policy for what can be put onto register\r\n - Some registries might constrain themselves for what types of data they allow\r\n - Policy for someone evaluating the contents of the registry to make discussions for fitness of use\r\n - REGO also considered as a policy language\r\n - Perhaps decide on multi\r\n - This policy discussion will happen in this WG for now, then maybe a sub working group\r\n - Dick Brooks: Metions HR4081\r\n - On topic for what we are \r\n - Talked about attestations for devices containing a camera and microphone and is connected to the internet\r\n - There will need to be an attestation from the device \r\n - Dick submitted to local rep to include software attestations as well\r\n - https://www.congress.gov/bill/117th-congress/house-bill/4081\r\nhttps://www.congress.gov/bill/117th-congress/house-bill/4081/text\r\n- Producers and other parties provide content into the system, attestations, claims, recorded into SCITT ledger\r\n- Dick said that you an contact your congress person to make sure we can see an impact because there is\r\n - Amendment for smartphone apps to provide a trust score\r\n - Tie in with OpenSSF metrics database to grab the security of repos involved\r\n - Dick\r\n - Proposed amendment I mentioned for HR 4081:\r\n - \"Require smart phone app stores to include a software supply chain trust score for each app\". This gives consumers the ability to check trustworthiness before installing an app,\r\n- Ray: thinkg about attestations is different than the trnsparency ledgre\r\n - Thinks it's a lot to bite off to do both\r\n - Are there IoT folks that might have more attestation experiance we could tap into?\r\n - Is there a sub-working group focused on device attetations (in response to HR4081)\r\n - Devicde attestations could be recorded tin the trnasparency ledgre\r\n - The device atteatestion itself it is diffciult to understand if it's really \r\n - TCG DICE WG is a target point of engagement (UCAN, CBOR, DID).\r\n - https://trustedcomputinggroup.org/work-groups/dice-architectures/\r\n - https://www.trustedcomputinggroup.org/wp-content/uploads/Device-Identifier-Composition-Engine-Rev69_Public-Review.pdf\r\n - Looking at hardware actively attesting\r\n - Microsoft has an Open Source implementation on GitHub\r\n - This attestation stuff starts to look at real life commerce, Ray thinks it's important to \r\n- Joshua Lock\r\n - On software attestations, I have been working on a page for the SLSA website to describe the model we're working with. 
I can share to the SCITT list once the change is merged.\r\n - IIRC the SCITT draft standards refer to a 'software attestation\" as a \"claim\", to disambiguate from RATS & TCG attestations\r\n- Remote Attestation and Device Attestation\r\n - Embraced COSE and CBOR\r\n - Also in SCITT\r\n - Hopefully we converge on underlying formats for both in-toto style and remote attestation style attestations\r\n- There are also NIST attestations\r\n- Vuln information mentioned by Kay as possible content inserted into SCITT\r\n - This is a goal of ours with our CVE Binary Tool engagement\r\n - We also could encode SBOMs from the systems that built them, we could patch sigstore to insert into a SCITT ledger",
"editedAt": "2022-07-25T15:48:06Z"
},
{
"diff": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n- Links\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- Global warming discussions\r\n- Others at RATs meeting or busy with other IETF activities\r\n- Introductions\r\n - Kelvin Cusack\r\n - Filling in for John from his org\r\n - John Andersen\r\n - Connecting dots between this and OpenSSF\r\n - Yogesh was excited to see someone from Intel\r\n - Intel is involved in RATs but not as much here\r\n - Kiran Karunakaran\r\n - Microsoft\r\n - On Kay Williams's team\r\n - Will likely lead this meeting in the future\r\n- Upcoming Birds of Feather (BoF)\r\n - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp\r\n - There is registration for remote\r\n - Problem Statement currently scoped around software\r\n - We went back to a more scoped problem statement is that we want to form a formal working group in the IETF for SCITT.\r\n - In order to form it we have to have stuffiest people show interest in the problem space\r\n - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress\r\n - The early design proposal for the SCITT transparency service is that the service is content agnostic\r\n - any kind of metadata could be provided and retrieved\r\n - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products\r\n - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. 
govt.)\r\n - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software\r\n - Leadership thought scope was too big at first\r\n- Thoughts around scope\r\n - Charter is focused on software, an attainable goal\r\n - Once we have a WG we can later broaden the scope via re-chartering\r\n- We will design something that works for hardware and for software\r\n - We are hanging software window curtains but we are looking at everything\r\n- Software systems interact with everything else\r\n - Dick Brooks (REA), any manifest could be signed and processes with SCITT\r\n - It's just metadata, what was issued, how it was issued\r\n - @dffml: We are encoding system contexts, Alice, into the chain, one place where she will live.\r\n- Opens\r\n - Open Policy Agent (mentioned in meeting minutes doc future topics)\r\n - What are your plans / thoughts around Open Policy Agent and encoding policies into SCITT?\r\n - Dick Brooks: Metions HR4081\r\n - On topic for what we are \r\n - Talked about attestations for devices containing a camera and microphone and is connected to the internet\r\n - There will need to be an attestation from the device \r\n - Dick submitted to local rep to include software attestations as well\r\n - https://www.congress.gov/bill/117th-congress/house-bill/4081\r\nhttps://www.congress.gov/bill/117th-congress/house-bill/4081/text\r\n- Producers and other parties provide content into the system, attestations, claims, recorded into SCITT ledger\r\n- Policy can be used in two places\r\n - Policy for what can be put onto register\r\n - Some registries might constrain themselves for what types of data they allow\r\n - Policy for someone evaluating the contents of the registry to make discussions for fitness of use\r\n - REGO also considered as a policy language\r\n - Perhaps decide on multi\r\n - This policy discussion will happen in this WG for now, then maybe a sub working group\r\n- Dick said that you an contact your congress person to make sure we can see an impact because there is\r\n - Amendment for smartphone apps to provide a trust score\r\n - Tie in with OpenSSF metrics database to grab the security of repos involved\r\n - Dick\r\n - Proposed amendment I mentioned for HR 4081:\r\n - \"Require smart phone app stores to include a software supply chain trust score for each app\". This gives consumers the ability to check trustworthiness before installing an app,\r\n- Ray: thinkg about attestations is different than the trnsparency ledgre\r\n - Thinks it's a lot to bite off to do both\r\n - Are there IoT folks that might have more attestation experiance we could tap into?\r\n - Is there a sub-working group focused on device attetations (in response to HR4081)\r\n - Devicde attestations could be recorded tin the trnasparency ledgre\r\n - The device atteatestion itself it is diffciult to understand if it's really \r\n - TCG DICE WG is a target point of engagement (UCAN, CBOR, DID).\r\n - https://trustedcomputinggroup.org/work-groups/dice-architectures/\r\n - https://www.trustedcomputinggroup.org/wp-content/uploads/Device-Identifier-Composition-Engine-Rev69_Public-Review.pdf\r\n - Looking at hardware actively attesting\r\n - Microsoft has an Open Source implementation on GitHub\r\n - This attestation stuff starts to look at real life commerce, Ray thinks it's important to \r\n- Joshua Lock\r\n - On software attestations, I have been working on a page for the SLSA website to describe the model we're working with. 
I can share to the SCITT list once the change is merged.\r\n - IIRC the SCITT draft standards refer to a 'software attestation\" as a \"claim\", to disambiguate from RATS & TCG attestations\r\n- Remote Attestation and Device Attestation\r\n - Embraced COSE and CBOR\r\n - Also in SCITT\r\n - Hopefully we converge on underlying formats for both in-toto style and remote attestation style attestations\r\n- There are also NIST attestations\r\n- Vuln information mentioned by Kay as possible content inserted into SCITT\r\n - This is a goal of ours with our CVE Binary Tool engagement\r\n - We also could encode SBOMs from the systems that built them, we could patch sigstore to insert into a SCITT ledger",
"editedAt": "2022-07-25T15:46:42Z"
},
{
"diff": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n- Links\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- Global warming discussions\r\n- Others at RATs meeting or busy with other IETF activities\r\n- Introductions\r\n - Kelvin Cusack\r\n - Filling in for John from his org\r\n - John Andersen\r\n - Connecting dots between this and OpenSSF\r\n - Yogesh was excited to see someone from Intel\r\n - Intel is involved in RATs but not as much here\r\n - Kiran Karunakaran\r\n - Microsoft\r\n - On Kay Williams's team\r\n - Will likely lead this meeting in the future\r\n- Upcoming Birds of Feather (BoF)\r\n - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp\r\n - There is registration for remote\r\n - Problem Statement currently scoped around software\r\n - We went back to a more scoped problem statement is that we want to form a formal working group in the IETF for SCITT.\r\n - In order to form it we have to have stuffiest people show interest in the problem space\r\n - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress\r\n - The early design proposal for the SCITT transparency service is that the service is content agnostic\r\n - any kind of metadata could be provided and retrieved\r\n - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products\r\n - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. 
govt.)\r\n - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software\r\n - Leadership thought scope was too big at first\r\n- Thoughts around scope\r\n - Charter is focused on software, an attainable goal\r\n - Once we have a WG we can later broaden the scope via re-chartering\r\n- We will design something that works for hardware and for software\r\n - We are hanging software window curtains but we are looking at everything\r\n- Software systems interact with everything else\r\n - Dick Brooks (REA), any manifest could be signed and processes with SCITT\r\n - It's just metadata, what was issued, how it was issued\r\n - @dffml: We are encoding system contexts, Alice, into the chain, one place where she will live.\r\n- Opens\r\n - Open Policy Agent (mentioned in meeting minutes doc future topics)\r\n - What are your plans / thoughts around Open Policy Agent and encoding policies into SCITT?\r\n - Dick Brooks: Metions HR4081\r\n - On topic for what we are \r\n - Talked about attestations for devices containing a camera and microphone and is connected to the internet\r\n - There will need to be an attestation from the device \r\n - Dick submitted to local rep to include software attestations as well\r\n - https://www.congress.gov/bill/117th-congress/house-bill/4081\r\nhttps://www.congress.gov/bill/117th-congress/house-bill/4081/text\r\n- Producers and other parties provide content into the system, attestations, claims, recorded into SCITT ledger\r\n- Policy can be used in two places\r\n - Policy for what can be put onto register\r\n - Some registries might constrain themselves for what types of data they allow\r\n - Policy for someone evaluating the contents of the registry to make discussions for fitness of use\r\n - REGO also considered as a policy language\r\n - Perhaps decide on multi\r\n - This policy discussion will happen in this WG for now, then maybe a sub working group\r\n- Dick said that you an contact your congress person to make sure we can see an impact because there is\r\n - Amendment for smartphone apps to provide a trust score\r\n - Tie in with OpenSSF metrics database to grab the security of repos involved\r\n - Dick\r\n - Proposed amendment I mentioned for HR 4081:\r\n - \"Require smart phone app stores to include a software supply chain trust score for each app\". This gives consumers the ability to check trustworthiness before installing an app,\r\n- Ray: thinkg about attestations is different than the trnsparency ledgre\r\n - Thinks it's a lot to bite off to do both\r\n - Are there IoT folks that might have more attestation experiance we could tap into?\r\n - Is there a sub-working group focused on device attetations (in response to HR4081)\r\n - Devicde attestations could be recorded tin the trnasparency ledgre\r\n - The device atteatestion itself it is diffciult to understand if it's really \r\n - TCG DICE WG is a target point of engagement (UCAN, CBOR, DID).\r\n - https://trustedcomputinggroup.org/work-groups/dice-architectures/\r\n - https://www.trustedcomputinggroup.org/wp-content/uploads/Device-Identifier-Composition-Engine-Rev69_Public-Review.pdf\r\n - Looking at hardware actively attesting\r\n - Microsoft has an Open Source implementation on GitHub\r\n - This attestation stuff starts to look at real life commerce, Ray thinks it's important to \r\n- Joshua Lock\r\n - On software attestations, I have been working on a page for the SLSA website to describe the model we're working with. 
I can share to the SCITT list once the change is merged.\r\n - IIRC the SCITT draft standards refer to a 'software attestation\" as a \"claim\", to disambiguate from RATS & TCG attestations\r\n- Remote Attestation and Device Attestation\r\n - Embraced COSE and CBOR\r\n - Also in SCITT\r\n - Hopefully we converge on underlying formats for both in-toto style and remote attestation style attestations\r\n- There are also NIST attestations",
"editedAt": "2022-07-25T15:45:19Z"
},
{
"diff": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n- Links\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- Global warming discussions\r\n- Others at RATs meeting or busy with other IETF activities\r\n- Introductions\r\n - Kelvin Cusack\r\n - Filling in for John from his org\r\n - John Andersen\r\n - Connecting dots between this and OpenSSF\r\n - Yogesh was excited to see someone from Intel\r\n - Intel is involved in RATs but not as much here\r\n - Kiran Karunakaran\r\n - Microsoft\r\n - On Kay Williams's team\r\n - Will likely lead this meeting in the future\r\n- Upcoming Birds of Feather (BoF)\r\n - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp\r\n - There is registration for remote\r\n - Problem Statement currently scoped around software\r\n - We went back to a more scoped problem statement is that we want to form a formal working group in the IETF for SCITT.\r\n - In order to form it we have to have stuffiest people show interest in the problem space\r\n - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress\r\n - The early design proposal for the SCITT transparency service is that the service is content agnostic\r\n - any kind of metadata could be provided and retrieved\r\n - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products\r\n - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. 
govt.)\r\n - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software\r\n - Leadership thought scope was too big at first\r\n- Thoughts around scope\r\n - Charter is focused on software, an attainable goal\r\n - Once we have a WG we can later broaden the scope via re-chartering\r\n- We will design something that works for hardware and for software\r\n - We are hanging software window curtains but we are looking at everything\r\n- Software systems interact with everything else\r\n - Dick Brooks (REA), any manifest could be signed and processes with SCITT\r\n - It's just metadata, what was issued, how it was issued\r\n - @dffml: We are encoding system contexts, Alice, into the chain, one place where she will live.\r\n- Opens\r\n - Open Policy Agent (mentioned in meeting minutes doc future topics)\r\n - What are your plans / thoughts around Open Policy Agent and encoding policies into SCITT?\r\n - Dick Brooks: Metions HR4081\r\n - On topic for what we are \r\n - Talked about attestations for devices containing a camera and microphone and is connected to the internet\r\n - There will need to be an attestation from the device \r\n - Dick submitted to local rep to include software attestations as well\r\n - https://www.congress.gov/bill/117th-congress/house-bill/4081\r\nhttps://www.congress.gov/bill/117th-congress/house-bill/4081/text\r\n- Producers and other parties provide content into the system, attestations, claims, recorded into SCITT ledger\r\n- Policy can be used in two places\r\n - Policy for what can be put onto register\r\n - Some registries might constrain themselves for what types of data they allow\r\n - Policy for someone evaluating the contents of the registry to make discussions for fitness of use\r\n - REGO also considered as a policy language\r\n - Perhaps decide on multi\r\n - This policy discussion will happen in this WG for now, then maybe a sub working group\r\n- Dick said that you an contact your congress person to make sure we can see an impact because there is\r\n - Amendment for smartphone apps to provide a trust score\r\n - Tie in with OpenSSF metrics database to grab the security of repos involved\r\n - Dick\r\n - Proposed amendment I mentioned for HR 4081:\r\n - \"Require smart phone app stores to include a software supply chain trust score for each app\". This gives consumers the ability to check trustworthiness before installing an app,\r\n- Ray: thinkg about attestations is different than the trnsparency ledgre\r\n - Thinks it's a lot to bite off to do both\r\n - Are there IoT folks that might have more attestation experiance we could tap into?\r\n - Is there a sub-working group focused on device attetations (in response to HR4081)\r\n - Devicde attestations could be recorded tin the trnasparency ledgre\r\n - The device atteatestion itself it is diffciult to understand if it's really \r\n - TCG DICE WG is a target point of engagement (UCAN, CBOR, DID).\r\n - https://trustedcomputinggroup.org/work-groups/dice-architectures/\r\n - https://www.trustedcomputinggroup.org/wp-content/uploads/Device-Identifier-Composition-Engine-Rev69_Public-Review.pdf\r\n - Looking at hardware actively attesting\r\n - Microsoft has an Open Source implementation on GitHub\r\n - This attestation stuff starts to look at real life commerce, Ray thinks it's important to \r\n- Joshua Lock\r\n - On software attestations, I have been working on a page for the SLSA website to describe the model we're working with. 
I can share to the SCITT list once the change is merged.\r\n - IIRC the SCITT draft standards refer to a 'software attestation\" as a \"claim\", to disambiguate from RATS & TCG attestations\r\n- Remote Attestation and Device Attestation\r\n - Embraced COSE and CBOR\r\n - Also in SCITT\r\n - Hopefully we converge on underlying formats for both in-toto style and remote attestation style attestations\r\n- There are also NIST attestations\r\n- ",
"editedAt": "2022-07-25T15:42:57Z"
},
{
"diff": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n- Links\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- Global warming discussions\r\n- Others at RATs meeting or busy with other IETF activities\r\n- Introductions\r\n - Kelvin Cusack\r\n - Filling in for John from his org\r\n - John Andersen\r\n - Connecting dots between this and OpenSSF\r\n - Yogesh was excited to see someone from Intel\r\n - Intel is involved in RATs but not as much here\r\n - Kiran Karunakaran\r\n - Microsoft\r\n - On Kay Williams's team\r\n - Will likely lead this meeting in the future\r\n- Upcoming Birds of Feather (BoF)\r\n - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp\r\n - There is registration for remote\r\n - Problem Statement currently scoped around software\r\n - We went back to a more scoped problem statement is that we want to form a formal working group in the IETF for SCITT.\r\n - In order to form it we have to have stuffiest people show interest in the problem space\r\n - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress\r\n - The early design proposal for the SCITT transparency service is that the service is content agnostic\r\n - any kind of metadata could be provided and retrieved\r\n - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products\r\n - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. 
govt.)\r\n - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software\r\n - Leadership thought scope was too big at first\r\n- Thoughts around scope\r\n - Charter is focused on software, an attainable goal\r\n - Once we have a WG we can later broaden the scope via re-chartering\r\n- We will design something that works for hardware and for software\r\n - We are hanging software window curtains but we are looking at everything\r\n- Software systems interact with everything else\r\n - Dick Brooks (REA), any manifest could be signed and processes with SCITT\r\n - It's just metadata, what was issued, how it was issued\r\n - @dffml: We are encoding system contexts, Alice, into the chain, one place where she will live.\r\n- Opens\r\n - Open Policy Agent (mentioned in meeting minutes doc future topics)\r\n - What are your plans / thoughts around Open Policy Agent and encoding policies into SCITT?\r\n - Dick Brooks: Metions HR4081\r\n - On topic for what we are \r\n - Talked about attestations for devices containing a camera and microphone and is connected to the internet\r\n - There will need to be an attestation from the device \r\n - Dick submitted to local rep to include software attestations as well\r\n - https://www.congress.gov/bill/117th-congress/house-bill/4081\r\nhttps://www.congress.gov/bill/117th-congress/house-bill/4081/text\r\n- Producers and other parties provide content into the system, attestations, claims, recorded into SCITT ledger\r\n- Policy can be used in two places\r\n - Policy for what can be put onto register\r\n - Some registries might constrain themselves for what types of data they allow\r\n - Policy for someone evaluating the contents of the registry to make discussions for fitness of use\r\n - REGO also considered as a policy language\r\n - Perhaps decide on multi\r\n - This policy discussion will happen in this WG for now, then maybe a sub working group\r\n- Dick said that you an contact your congress person to make sure we can see an impact because there is\r\n - Amendment for smartphone apps to provide a trust score\r\n - Tie in with OpenSSF metrics database to grab the security of repos involved\r\n - Dick\r\n - Proposed amendment I mentioned for HR 4081:\r\n - \"Require smart phone app stores to include a software supply chain trust score for each app\". This gives consumers the ability to check trustworthiness before installing an app,\r\n- Ray: thinkg about attestations is different than the trnsparency ledgre\r\n - Thinks it's a lot to bite off to do both\r\n - Are there IoT folks that might have more attestation experiance we could tap into?\r\n - Is there a sub-working group focused on device attetations (in response to HR4081)\r\n - Devicde attestations could be recorded tin the trnasparency ledgre\r\n - The device atteatestion itself it is diffciult to understand if it's really \r\n - DICE WG is a target point of engagement (UCAN, CBOR, DID).\r\n - This attestation stuff starts to look at real life commerce, Ray thinks it's importanta to ",
"editedAt": "2022-07-25T15:35:59Z"
},
{
"diff": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n- Links\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- Global warming discussions\r\n- Others at RATs meeting or busy with other IETF activities\r\n- Introductions\r\n - Kelvin Cusack\r\n - Filling in for John from his org\r\n - John Andersen\r\n - Connecting dots between this and OpenSSF\r\n - Yogesh was excited to see someone from Intel\r\n - Intel is involved in RATs but not as much here\r\n - Kiran Karunakaran\r\n - Microsoft\r\n - On Kay Williams's team\r\n - Will likely lead this meeting in the future\r\n- Upcoming Birds of Feather (BoF)\r\n - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp\r\n - Problem Statement currently scoped around software\r\n - We went back to a more scoped problem statemetn is that we want to form a formal working group in the IETF for SCITT.\r\n - In order to form it we have to have suffient people show interest in the problem space\r\n - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress\r\n - The early design propsal for the SCITT transparency service is that the service is content agnositic\r\n - any kind of metadata could be provided and retrieved\r\n - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products\r\n - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. govt.)\r\n - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software\r\n - Leadership thought scope was too big at first\r\n- Thoughts around scope\r\n - Charter is focused on software, an attainable goal\r\n - Once we have a WG we can later broaden the scope via re-chartering\r\n- We will design something that works for hardware and for software\r\n - We are hanging software window curtains but we are looking at everything\r\n- Software systems interact with everything else\r\n - Dick Brooks, any manifest could be signed and processes with SCITT\r\n - It's just metadata, what was issued, how it was issued\r\n - @dffml: We are encoding system contexts, Alice, into the chain, one place where she will live.",
"editedAt": "2022-07-25T15:21:44Z"
},
{
"diff": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- Links\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3181956\r\n - https://github.com/intel/dffml/discussions/1406#discussioncomment-3223247\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- Global warming discussions\r\n- Others at RATs meeting or busy with other IETF activities\r\n- Introductions\r\n - Kelvin Cusack\r\n - Filling in for John from his org\r\n - John Andersen\r\n - Connecting dots between this and OpenSSF\r\n - Yogesh was excited to see someone from Intel\r\n - Intel is involved in RATs but not as much here\r\n - Kiran Karunakaran\r\n - Microsoft\r\n - On Kay Williams's team\r\n - Will likely lead this meeting in the future\r\n- Upcoming Birds of Feather (BoF)\r\n - You need to register here: https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit?pli=1#heading=h.214jg0n2xjhp\r\n - Problem Statement currently scoped around software\r\n - We went back to a more scoped problem statemetn is that we want to form a formal working group in the IETF for SCITT.\r\n - In order to form it we have to have suffient people show interest in the problem space\r\n - Need problem space and charter so that it's scoped enough that the leadership is confident that the group can make progress\r\n - The early design propsal for the SCITT transparency service is that the service is content agnositic\r\n - any kind of metadata could be provided and retrieved\r\n - SBOMs, Software Test Cases, Hardware BOMs, Test results on other types of products\r\n - Electronic ballots (Ray, see mailing list), oil, gas, physical goods (U.S. govt.)\r\n - In order to gain confidence from the leadership at IETF to form the WG we felt it was critical to narrow the scope for now to software\r\n - Leadership thought scope was too big at first",
"editedAt": "2022-07-25T15:17:51Z"
},
{
"diff": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- Folks will be in Philly on Thursday for meeting\r\n - There is a remote link (see mailing list?) for the Thursday meeting\r\n- ",
"editedAt": "2022-07-25T15:06:46Z"
},
{
"diff": "## 2022-07-25 Supply Chain Integrity, Transparency and Trust (SCITT)\r\n\r\n- Folks will be in Philly on Thursday",
"editedAt": "2022-07-25T15:03:43Z"
}
]
}
}
]
}
},
{
"body": "# 2022-07-26 Engineering Logs\r\n\r\n- TODO\r\n - [x] @aliceoa, @pdxjohnny: Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] @pdxjohnny: Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] @dffml: Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.",
"createdAt": "2022-07-26T13:09:49Z",
"userContentEdits": {
"pageInfo": {
"endCursor": null,
"hasNextPage": false
},
"totalCount": 0,
"nodes": []
},
"replies": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpK5MjAyMi0wNy0yNlQwNjoxNToxMS0wNzowMM4AMU19",
"hasNextPage": false
},
"totalCount": 1,
"nodes": [
{
"body": "## 2022-07-26 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n![Software Analysis Trinity drawio](https://user-images.githubusercontent.com/5950433/181014158-4187950e-d0a4-4d7d-973b-dc414320e64f.svg)\r\n\r\n- Update current overlays to have lock taken on `AliceGitRepo` and then subflows with `ReadmeGitRepo` and `ContributingGitRepo`.\r\n - This way the parent flow locks and they don't have to worry about loosing the lock between operations.\r\n\r\n```console\r\n$ git grep -C 22 run_custom\r\nalice/please/contribute/recommended_community_standards/cli.py- async def cli_run_on_repo(self, repo: \"CLIRunOnRepo\"):\r\nalice/please/contribute/recommended_community_standards/cli.py- # TODO Similar to Expand being an alias of Union\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # async def cli_run_on_repo(self, repo: 'CLIRunOnRepo') -> SystemContext[StringInputSetContext[AliceGitRepo]]:\r\nalice/please/contribute/recommended_community_standards/cli.py- # return repo\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # Or ideally at class scope\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # 'CLIRunOnRepo' -> SystemContext[StringInputSetContext[AliceGitRepo]]\r\nalice/please/contribute/recommended_community_standards/cli.py- async with self.parent.__class__(self.parent.config) as custom_run_dataflow:\r\nalice/please/contribute/recommended_community_standards/cli.py- async with custom_run_dataflow(\r\nalice/please/contribute/recommended_community_standards/cli.py- self.ctx, self.octx\r\nalice/please/contribute/recommended_community_standards/cli.py- ) as custom_run_dataflow_ctx:\r\nalice/please/contribute/recommended_community_standards/cli.py- # This is the type 
cast\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow.op = self.parent.op._replace(\r\nalice/please/contribute/recommended_community_standards/cli.py- inputs={\r\nalice/please/contribute/recommended_community_standards/cli.py- \"repo\": AlicePleaseContributeRecommendedCommunityStandards.RepoString\r\nalice/please/contribute/recommended_community_standards/cli.py- }\r\nalice/please/contribute/recommended_community_standards/cli.py- )\r\nalice/please/contribute/recommended_community_standards/cli.py- # Set the dataflow to be the same flow\r\nalice/please/contribute/recommended_community_standards/cli.py- # TODO Reuse ictx? Is that applicable?\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow.config.dataflow = self.octx.config.dataflow\r\nalice/please/contribute/recommended_community_standards/cli.py: await dffml.run_dataflow.run_custom(\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow_ctx, {\"repo\": repo},\r\nalice/please/contribute/recommended_community_standards/cli.py- )\r\n```",
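The locking note above (take the lock on `AliceGitRepo` in the parent flow, then run the `ReadmeGitRepo` and `ContributingGitRepo` subflows while it is held) boils down to the pattern sketched below. This is a minimal asyncio sketch under assumed names; it is not the DFFML overlay implementation, just an illustration of why holding the lock in the parent avoids losing it between operations.

```python
# Minimal sketch of the "lock in the parent flow" idea; names are assumptions.
import asyncio
from dataclasses import dataclass
from typing import Dict


@dataclass(frozen=True)
class AliceGitRepo:
    directory: str
    URL: str


# One lock per repo URL so contributions to the same repo never interleave.
REPO_LOCKS: Dict[str, asyncio.Lock] = {}


async def contribute_readme(repo: AliceGitRepo) -> None:
    print(f"would add README.md in {repo.directory} for {repo.URL}")


async def contribute_contributing(repo: AliceGitRepo) -> None:
    print(f"would add CONTRIBUTING.md in {repo.directory} for {repo.URL}")


async def contribute(repo: AliceGitRepo) -> None:
    # The parent flow acquires the lock once; the subflows run while it is
    # held, so there is no window between operations where it could be lost.
    lock = REPO_LOCKS.setdefault(repo.URL, asyncio.Lock())
    async with lock:
        await contribute_readme(repo)
        await contribute_contributing(repo)


if __name__ == "__main__":
    asyncio.run(contribute(AliceGitRepo(".", "https://github.com/intel/dffml")))
```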
"createdAt": "2022-07-26T13:15:11Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABO0xQ==",
"hasNextPage": false
},
"totalCount": 5,
"nodes": [
{
"diff": "## 2022-07-26 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n![Software Analysis Trinity drawio](https://user-images.githubusercontent.com/5950433/181014158-4187950e-d0a4-4d7d-973b-dc414320e64f.svg)\r\n\r\n- Update current overlays to have lock taken on `AliceGitRepo` and then subflows with `ReadmeGitRepo` and `ContributingGitRepo`.\r\n - This way the parent flow locks and they don't have to worry about loosing the lock between operations.\r\n\r\n```console\r\n$ git grep -C 22 run_custom\r\nalice/please/contribute/recommended_community_standards/cli.py- async def cli_run_on_repo(self, repo: \"CLIRunOnRepo\"):\r\nalice/please/contribute/recommended_community_standards/cli.py- # TODO Similar to Expand being an alias of Union\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # async def cli_run_on_repo(self, repo: 'CLIRunOnRepo') -> SystemContext[StringInputSetContext[AliceGitRepo]]:\r\nalice/please/contribute/recommended_community_standards/cli.py- # return repo\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # Or ideally at class scope\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # 'CLIRunOnRepo' -> SystemContext[StringInputSetContext[AliceGitRepo]]\r\nalice/please/contribute/recommended_community_standards/cli.py- async with self.parent.__class__(self.parent.config) as custom_run_dataflow:\r\nalice/please/contribute/recommended_community_standards/cli.py- async with custom_run_dataflow(\r\nalice/please/contribute/recommended_community_standards/cli.py- self.ctx, self.octx\r\nalice/please/contribute/recommended_community_standards/cli.py- ) as custom_run_dataflow_ctx:\r\nalice/please/contribute/recommended_community_standards/cli.py- # This is the type 
cast\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow.op = self.parent.op._replace(\r\nalice/please/contribute/recommended_community_standards/cli.py- inputs={\r\nalice/please/contribute/recommended_community_standards/cli.py- \"repo\": AlicePleaseContributeRecommendedCommunityStandards.RepoString\r\nalice/please/contribute/recommended_community_standards/cli.py- }\r\nalice/please/contribute/recommended_community_standards/cli.py- )\r\nalice/please/contribute/recommended_community_standards/cli.py- # Set the dataflow to be the same flow\r\nalice/please/contribute/recommended_community_standards/cli.py- # TODO Reuse ictx? Is that applicable?\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow.config.dataflow = self.octx.config.dataflow\r\nalice/please/contribute/recommended_community_standards/cli.py: await dffml.run_dataflow.run_custom(\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow_ctx, {\"repo\": repo},\r\nalice/please/contribute/recommended_community_standards/cli.py- )\r\n```",
"editedAt": "2022-07-26T16:42:41Z"
},
{
"diff": "## 2022-07-26 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n![Software Analysis Trinity drawio](https://user-images.githubusercontent.com/5950433/181014158-4187950e-d0a4-4d7d-973b-dc414320e64f.svg)\r\n\r\n- Update current overlays to have lock taken on `AliceGitRepo` and then subflows with `ReadmeGitRepo` and `ContributingGitRepo`.\r\n - This way the parent flow locks and they don't have to worry about loosing the lock between operations.\r\n\r\n```console\r\n$ git grep -C 22 run_custom\r\nalice/please/contribute/recommended_community_standards/cli.py- async def cli_run_on_repo(self, repo: \"CLIRunOnRepo\"):\r\nalice/please/contribute/recommended_community_standards/cli.py- # TODO Similar to Expand being an alias of Union\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # async def cli_run_on_repo(self, repo: 'CLIRunOnRepo') -> SystemContext[StringInputSetContext[AliceGitRepo]]:\r\nalice/please/contribute/recommended_community_standards/cli.py- # return repo\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # Or ideally at class scope\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # 'CLIRunOnRepo' -> SystemContext[StringInputSetContext[AliceGitRepo]]\r\nalice/please/contribute/recommended_community_standards/cli.py- async with self.parent.__class__(self.parent.config) as custom_run_dataflow:\r\nalice/please/contribute/recommended_community_standards/cli.py- async with custom_run_dataflow(\r\nalice/please/contribute/recommended_community_standards/cli.py- self.ctx, self.octx\r\nalice/please/contribute/recommended_community_standards/cli.py- ) as custom_run_dataflow_ctx:\r\nalice/please/contribute/recommended_community_standards/cli.py- # This is the type cast\r\nalice/please/contribute/recommended_community_standards/cli.py- 
custom_run_dataflow.op = self.parent.op._replace(\r\nalice/please/contribute/recommended_community_standards/cli.py- inputs={\r\nalice/please/contribute/recommended_community_standards/cli.py- \"repo\": AlicePleaseContributeRecommendedCommunityStandards.RepoString\r\nalice/please/contribute/recommended_community_standards/cli.py- }\r\nalice/please/contribute/recommended_community_standards/cli.py- )\r\nalice/please/contribute/recommended_community_standards/cli.py- # Set the dataflow to be the same flow\r\nalice/please/contribute/recommended_community_standards/cli.py- # TODO Reuse ictx? Is that applicable?\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow.config.dataflow = self.octx.config.dataflow\r\nalice/please/contribute/recommended_community_standards/cli.py: await dffml.run_dataflow.run_custom(\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow_ctx, {\"repo\": repo},\r\nalice/please/contribute/recommended_community_standards/cli.py- )\r\n```",
"editedAt": "2022-07-26T16:42:19Z"
},
{
"diff": "## 2022-07-26 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n![Software Analysis Trinity drawio](https://user-images.githubusercontent.com/5950433/181014158-4187950e-d0a4-4d7d-973b-dc414320e64f.svg)\r\n\r\n- Update current overlays to have lock taken on `AliceGitRepo` and then subflows with `ReadmeGitRepo` and `ContributingGitRepo`.\r\n - This way the parent flow locks and they don't have to worry about loosing the lock between operations.\r\n\r\n```console\r\n$ git grep -C 22 run_custom\r\nalice/please/contribute/recommended_community_standards/cli.py- async def cli_run_on_repo(self, repo: \"CLIRunOnRepo\"):\r\nalice/please/contribute/recommended_community_standards/cli.py- # TODO Similar to Expand being an alias of Union\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # async def cli_run_on_repo(self, repo: 'CLIRunOnRepo') -> SystemContext[StringInputSetContext[AliceGitRepo]]:\r\nalice/please/contribute/recommended_community_standards/cli.py- # return repo\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # Or ideally at class scope\r\nalice/please/contribute/recommended_community_standards/cli.py- #\r\nalice/please/contribute/recommended_community_standards/cli.py- # 'CLIRunOnRepo' -> SystemContext[StringInputSetContext[AliceGitRepo]]\r\nalice/please/contribute/recommended_community_standards/cli.py- async with self.parent.__class__(self.parent.config) as custom_run_dataflow:\r\nalice/please/contribute/recommended_community_standards/cli.py- async with custom_run_dataflow(\r\nalice/please/contribute/recommended_community_standards/cli.py- self.ctx, self.octx\r\nalice/please/contribute/recommended_community_standards/cli.py- ) as custom_run_dataflow_ctx:\r\nalice/please/contribute/recommended_community_standards/cli.py- # This is the type cast\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow.op = self.parent.op._replace(\r\nalice/please/contribute/recommended_community_standards/cli.py- 
inputs={\r\nalice/please/contribute/recommended_community_standards/cli.py- \"repo\": AlicePleaseContributeRecommendedCommunityStandards.RepoString\r\nalice/please/contribute/recommended_community_standards/cli.py- }\r\nalice/please/contribute/recommended_community_standards/cli.py- )\r\nalice/please/contribute/recommended_community_standards/cli.py- # Set the dataflow to be the same flow\r\nalice/please/contribute/recommended_community_standards/cli.py- # TODO Reuse ictx? Is that applicable?\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow.config.dataflow = self.octx.config.dataflow\r\nalice/please/contribute/recommended_community_standards/cli.py: await dffml.run_dataflow.run_custom(\r\nalice/please/contribute/recommended_community_standards/cli.py- custom_run_dataflow_ctx, {\"repo\": repo},\r\nalice/please/contribute/recommended_community_standards/cli.py- )\r\n```",
"editedAt": "2022-07-26T16:01:57Z"
},
{
"diff": "## 2022-07-26 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n![Software Analysis Trinity drawio](https://user-images.githubusercontent.com/5950433/181014158-4187950e-d0a4-4d7d-973b-dc414320e64f.svg)",
"editedAt": "2022-07-26T13:15:26Z"
},
{
"diff": "## 2022-07-26 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] [SoftwareAnalysisTrinity.drawio.xml.txt](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n![Software Analysis Trinity drawio](https://user-images.githubusercontent.com/5950433/181014158-4187950e-d0a4-4d7d-973b-dc414320e64f.svg)",
"editedAt": "2022-07-26T13:15:11Z"
}
]
}
}
]
}
},
{
"body": "# 2022-07-27 Engineering Logs\r\n\r\n- References\r\n - kaniko coder k3d digitalocean\r\n - The following were issues with kind which might also effect us\r\n - https://github.com/GoogleContainerTools/kaniko/issues/2164\r\n - https://github.com/tektoncd/pipeline/commit/6542823c8330581fcfe6ba5a8ea7682a06510bcb\r\n - It doesn't look like kaniko currently supports multi context builds\r\n - Great example of communication and meeting procedures link to code\r\n - https://lists.spdx.org/g/Spdx-tech/message/4699",
"createdAt": "2022-07-27T20:43:28Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABPIvA==",
"hasNextPage": false
},
"totalCount": 6,
"nodes": [
{
"diff": "# 2022-07-27 Engineering Logs\r\n\r\n- References\r\n - kaniko coder k3d digitalocean\r\n - The following were issues with kind which might also effect us\r\n - https://github.com/GoogleContainerTools/kaniko/issues/2164\r\n - https://github.com/tektoncd/pipeline/commit/6542823c8330581fcfe6ba5a8ea7682a06510bcb\r\n - It doesn't look like kaniko currently supports multi context builds\r\n - Great example of communication and meeting procedures link to code\r\n - https://lists.spdx.org/g/Spdx-tech/message/4699",
"editedAt": "2022-07-28T03:10:16Z"
},
{
"diff": "# 2022-07-27 Engineering Logs\r\n\r\n- References\r\n - kaniko coder k3d digitalocean\r\n - The following were issues with kind which might also effect us\r\n - https://github.com/GoogleContainerTools/kaniko/issues/2164\r\n - https://github.com/tektoncd/pipeline/commit/6542823c8330581fcfe6ba5a8ea7682a06510bcb\r\n - Great example of communication and meeting procedures link to code\r\n - https://lists.spdx.org/g/Spdx-tech/message/4699",
"editedAt": "2022-07-28T03:03:01Z"
},
{
"diff": "# 2022-07-27\r\n\r\n- References\r\n - kaniko coder k3d digitalocean\r\n - The following were issues with kind which might also effect us\r\n - https://github.com/GoogleContainerTools/kaniko/issues/2164\r\n - https://github.com/tektoncd/pipeline/commit/6542823c8330581fcfe6ba5a8ea7682a06510bcb\r\n - Great example of communication and meeting procedures link to code\r\n - https://lists.spdx.org/g/Spdx-tech/message/4699",
"editedAt": "2022-07-27T21:33:53Z"
},
{
"diff": "# 2022-07-27\r\n\r\n- References\r\n - kaniko coder k3d\r\n - The following were issues with kind which might also effect us\r\n - https://github.com/GoogleContainerTools/kaniko/issues/2164\r\n - https://github.com/tektoncd/pipeline/commit/6542823c8330581fcfe6ba5a8ea7682a06510bcb\r\n - Great example of communication and meeting procedures link to code\r\n - https://lists.spdx.org/g/Spdx-tech/message/4699",
"editedAt": "2022-07-27T21:33:37Z"
},
{
"diff": "\r\n- References\r\n - kaniko coder k3d\r\n - The following were issues with kind which might also effect us\r\n - https://github.com/GoogleContainerTools/kaniko/issues/2164\r\n - https://github.com/tektoncd/pipeline/commit/6542823c8330581fcfe6ba5a8ea7682a06510bcb\r\n - Great example of communication and meeting procedures link to code\r\n - https://lists.spdx.org/g/Spdx-tech/message/4699",
"editedAt": "2022-07-27T21:33:22Z"
},
{
"diff": "\r\n- References\r\n - Great example of communication and meeting procedures link to code\r\n - https://lists.spdx.org/g/Spdx-tech/message/4699",
"editedAt": "2022-07-27T20:43:28Z"
}
]
},
"replies": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpK5MjAyMi0wNy0yN1QyMDozNzowMS0wNzowMM4AMdbI",
"hasNextPage": false
},
"totalCount": 1,
"nodes": [
{
"body": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n### Refactoring and Thinking About Locking of Repos for Contributions\r\n\r\n- Metadata\r\n - Date: 2022-07-27 20:30 UTC -7\r\n- Saving this diff which was some work on dynamic application of overlay\r\n so as to support fixup of the OpImp for `meta_issue_body()`'s inputs.\r\n - We are going to table this for now for time reasons, but if someone\r\n wants to pick it up before @pdxjohnny is back in September, please\r\n give it a go (create an issue).\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n- `alice`\r\n - Goal: Display Alice and software analysis trinity\r\n - https://free-images.com/search/?q=alice%27s+adventures+in+wonderland&cat=st\r\n - https://free-images.com/display/de_alices_abenteuer_im_43.html\r\n - https://github.com/KhorSL/ASCII-ART\r\n - Completed in d067273f8571b6a56733336663aaebc3acb3a701\r\n\r\n![alice looking up](https://user-images.githubusercontent.com/5950433/181431145-18cfc8a7-28c8-486f-80f9-8b250e0b0943.png)\r\n\r\n```console\r\n$ python ascii_art.py /mnt/c/Users/Johnny/Downloads/alice-looking-up-white-background.png\r\n```\r\n\r\n```console\r\n$ alice\r\nusage: alice [-h] [-log LOG] {please,shouldi,threats} ...\r\n\r\n .,*&&888@@#&:,\r\n .:&::,...,:&#@@@#:.\r\n .o,. 
..:8@@#@@+\r\n .8o+,+o*+*+,+:&#@@#8@@.\r\n &8&###@#&..*:8#@@#@#@@&+.\r\n ,@:#@##@@8,:&#@@@###@88@@.\r\n ,#@8&#@@@#o:#@@@@#8#@#8+&#.\r\n +8####@@@@###@@@888#@@@#oo#.\r\n .*8@###@@@@@@@@@#o*#@@#@@#8o@,\r\n +###@#o8&#@@##8::##@@@&&#@8#&+\r\n o@8&#&##::.,o&+88#&8##8*@@#@#,\r\n .##888&&oo#&o8###8&o##8##&####8,\r\n .&#@8&:+o+&@@@#8#&8:8@@@@@#8@@@oo+\r\n ,&&#@##oo+*:@###&#88,@@@@#@o&##&8#@o,.\r\n ,#&###@@8:*,#o&@@@@##:&#@###*.&o++o#@@#&+\r\n o8&8o8@#8+,,#.88#@#&@&&#@##++*&#o&&&#@@@@.\r\n *88:,#8&#,o+:+@&8#:8@8&8#@@&o++,*++*+:#@@*.\r\n .+#:o###@8o&8*@o&o8@o888@@@o+:o*&&,@#:&@@@,\r\n *+&@8&#@o#8+8*#+8#+88@@@@@@&@###8##@8:*,\r\n +o.@##@@@&88@*8@:8@@@@@@:.. ,8@:++.\r\n +&++8@@@@##@@@@@@@@@@@+ 88\r\n &. *@8@:+##o&888#@@@, .#+\r\n &. ,@+o,.::+*+*:&#&, ,@.\r\n &. .@8*,. ,*+++.+* :8+\r\n :+ .#@::. .8:.:** .8@@o,\r\n .o. #@+ :@,.&* .:@@@@@@8**.\r\n +&. :@o,+.*o,*, .*@@@@@@@@@@#o\r\n .*:&o. 8@o:,*:, .o@@#8&&@@@@#@@@*\r\n ,*:+:::o.*&8+,++ ,&@@#: * :@@88@@@#:.\r\n ,::**:o:.,&*+*8: *8@@##o *,.8@@#8#@#@#+\r\n *:+*&o8:. ,o,o:8@+o@@88:*@+ +: +#@#####8##&.\r\n ,:&::88&, .&:#o#@@@#,+&&*#&. .:,.&#@#88#####&,\r\n +::o+&8:. :##88@@@@:.:8o+&8&. .. +8###&8&##&88*\r\n .:*+*.8#: ,o*.+&@@#@8,,o8*+8##+ .+#8##8&&#8888:.\r\n ,:o., &#8. .:8*. .o, &#,*:8:+,&*:, .8@@#o&&##8:&#8.\r\n .*o.*,+o8#* +8&, .::. .88.+:8o: ,+:, ,o#@#8&o8##&#8+\r\n +o, .+,,o#8+,8@o**.,o*, :8o +*8#* +&, ,*o@@#@&8&oo8&:,\r\n oo*+,,,*8@#..&@8:**:oo+. +8#* *+#@:...oo+ .**:8@@@ooo&:&o##+\r\n ::+..,++#@,.:##o&o**,....oo#++#8#@:.,:8&:.....*&@@#:oo*&oo&#@*\r\n .+**:*8@o,+##&o:+,,,+,,o*8#,,8@#@:,,+*o*++,,,,+&#@8*8o88&::*. .,,,,,++,\r\n ..8@++#@#88:,,,.,,,:+#&,,#@@#:,,.,&o*,.+++*:#@8+:*+. ......,:+*&,,.....\r\n +:&8#@@##8&+,,,***@&,.8@@@*,,,.:o8&o&*o&o&o. .,.****::*:o*:o*o+,.\r\n ...,*:*o&&o*8@@&o8@@@8+,,+:&&:+,... ,++*&oo&8&&&oo#@##8#&8:.\r\n o@#@@@@#@@@@@@@,..... ..,,.+*::o#@##@##@#@#########@@@8:,.\r\n ,@##@@88#@@@@@8 .:***oo*#8###8#@#@#@#@####@#@###@@#8&#:\r\n 8+.,8+..,*o#@+ ,o+o88&88###@8#######@8#8#88#8#88##88#&\r\n *o *+ #8 . ,*o&#@##@@@@@@@@@######8#888&&oo:8:\r\n 8, ,& +@* .ooo&#@@@@@#@@@@@@####@##8#8##oo:o&:,\r\n +& &, .@#. .:8#@@@@@@@@@@##8#####8#o&&#8*:8&&8:\r\n o* ,o o@& +o#@@@@@@@@#o&o88:&+ooo&:*::o:o&**o.:*+\r\n .8. 8.,o#8 .+&#@@@@@@@@&o+,::*+*:+:, ,. ,.. .,. ,.\r\n 8. 
8.,.&@:*:&@@@@@@@@8o+, ,.\r\n :@o:#,,o8&:o&@@@@#&:+.\r\n .@@@@@@@@@@@#8&o+,\r\n ,*:&#@#&o*,..\r\n\r\n /\\\r\n / \\\r\n Intent\r\n / \\\r\n / \\\r\n / \\\r\n / \\\r\n / \\\r\n / Alice is Here \\\r\n / \\\r\n / \\\r\n /______________________\\\r\n\r\n Dynamic Analysis Static Analysis\r\n\r\n Alice's source code: https://github.com/intel/dffml/tree/alice/entities/alice\r\n How we built Alice: https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n How to extend Alice: https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n Comment to get involved: https://github.com/intel/dffml/discussions/1406\r\n\r\n\r\npositional arguments:\r\n {please,shouldi,threats}\r\n\r\noptional arguments:\r\n -h, --help show this help message and exit\r\n -log LOG Logging Level\r\n```\r\n\r\n- TODO\r\n - [ ] Auto fork repo before push\r\n - [ ] Update origin to push to\r\n - [ ] Create PR\r\n - [ ] Update README to fix demos\r\n - [ ] Update CONTRIBUTING with tutorial on adding\r\n `CONTRIBUTING.md` check and contribution\r\n\r\n**entities/alice/alice/timelines.py**\r\n\r\n```python\r\n\"\"\"\r\nHelpers for the timelines we support\r\n\"\"\"\r\n\r\n# Trinity Day 0\r\nALICE_DAY_0_GREGORIAN = datetime.datetime(2022, 4, 16)\r\n\r\ndef date_alice_from_gregorian(date: str) -> int:\r\n # TODO\r\n return ALICE_DAY_0_GREGORIAN\r\n```\r\n\r\n```diff\r\ndiff --git a/dffml/base.py b/dffml/base.py\r\nindex fea0ef7220..9d6cd886fa 100644\r\n--- a/dffml/base.py\r\n+++ b/dffml/base.py\r\n@@ -237,6 +237,7 @@ def convert_value(arg, value, *, dataclass=None):\r\n # before checking if the value is an instance of that\r\n # type. Since it doesn't make sense to check if the\r\n # value is an instance of something that's not a type.\r\n+ print(possible_type, value)\r\n if isinstance(possible_type, type) and isinstance(\r\n value, possible_type\r\n ):\r\ndiff --git a/dffml/df/system_context/system_context.py b/dffml/df/system_context/system_context.py\r\nindex e055a343f1..063547ad0c 100644\r\n--- a/dffml/df/system_context/system_context.py\r\n+++ b/dffml/df/system_context/system_context.py\r\n@@ -90,11 +90,11 @@ class SystemContextConfig:\r\n # links: 'SystemContextConfig'\r\n overlay: Union[\"SystemContextConfig\", DataFlow] = field(\r\n \"The overlay we will apply with any overlays to merge within it (see default overlay usage docs)\",\r\n- default=APPLY_INSTALLED_OVERLAYS,\r\n+ default=None,\r\n )\r\n orchestrator: Union[\"SystemContextConfig\", BaseOrchestrator] = field(\r\n \"The system context who's default flow will be used to produce an orchestrator which will be used to execute this system context including application of overlays\",\r\n- default_factory=lambda: MemoryOrchestrator,\r\n+ default=None,\r\n )\r\n \r\n \r\n@@ -131,6 +131,7 @@ class SystemContext(BaseDataFlowFacilitatorObject):\r\n )\r\n # TODO(alice) Apply overlay\r\n if self.config.overlay not in (None, APPLY_INSTALLED_OVERLAYS):\r\n+ print(self.config.overlay)\r\n breakpoint()\r\n raise NotImplementedError(\r\n \"Application of overlays within SystemContext class entry not yet supported\"\r\ndiff --git a/dffml/high_level/dataflow.py b/dffml/high_level/dataflow.py\r\nindex d180b5c302..d595ae1cb4 100644\r\n--- a/dffml/high_level/dataflow.py\r\n+++ b/dffml/high_level/dataflow.py\r\n@@ -206,12 +206,25 @@ async def run(\r\n # the of the one that got passed in and the overlay.\r\n if inspect.isclass(overlay):\r\n overlay = overlay()\r\n+ # TODO Move this into Overlay.load. 
Create a system context to\r\n+ # execute the overlay if it is not already.\r\n+ known_overlay_types = (DataFlow, SystemContext)\r\n+ if not isinstance(overlay, known_overlay_types):\r\n+ raise NotImplementedError(f\"{overlay} is not a known type {known_overlay_types}\")\r\n+ if isinstance(overlay, DataFlow):\r\n+ overlay = SystemContext(\r\n+ upstream=overlay,\r\n+ )\r\n # TODO(alice) overlay.deployment(\"native.python.overlay.apply\")\r\n apply_overlay = overlay.deployment()\r\n async for _ctx, result in apply_overlay(\r\n dataflow=dataflow,\r\n ):\r\n+ print(\"FEEDFACE\", _ctx, result)\r\n+ breakpoint()\r\n+ return\r\n continue\r\n+\r\n # TODO\r\n resultant_system_context = SystemContext(\r\n upstream=result[\"overlays_merged\"], overlay=None,\r\ndiff --git a/dffml/overlay/overlay.py b/dffml/overlay/overlay.py\r\nindex 13a50d9c10..0a01d38de9 100644\r\n--- a/dffml/overlay/overlay.py\r\n+++ b/dffml/overlay/overlay.py\r\n@@ -124,7 +124,7 @@ DFFML_MAIN_PACKAGE_OVERLAY = DataFlow(\r\n stage=Stage.OUTPUT,\r\n inputs={\r\n \"merged\": DataFlowAfterOverlaysMerged,\r\n- \"dataflow_we_are_applying_overlays_to_by_running_overlay_dataflow_and_passing_as_an_input\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n+ \"upstream\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n },\r\n outputs={\"overlayed\": DataFlowAfterOverlaysApplied,},\r\n multi_output=False,\r\n@@ -208,15 +208,12 @@ merge_implementations(\r\n DFFML_OVERLAYS_INSTALLED.update(auto_flow=True)\r\n \r\n # Create Class for calling operations within the System Context as methods\r\n-DFFMLOverlaysInstalled = SystemContext.subclass(\r\n- \"DFFMLOverlaysInstalled\",\r\n- {\r\n- \"upstream\": {\"default_factory\": lambda: DFFML_OVERLAYS_INSTALLED},\r\n- # TODO(alice) We'll need to make sure we have code to instantiate and\r\n- # instance of a class if only a class is given an not an instance.\r\n- \"overlay\": {\"default_factory\": lambda: None},\r\n- \"orchestrator\": {\"default_factory\": lambda: MemoryOrchestrator()},\r\n- },\r\n+DFFMLOverlaysInstalled = SystemContext(\r\n+ upstream=DFFML_OVERLAYS_INSTALLED,\r\n+ # TODO(alice) We'll need to make sure we have code to instantiate and\r\n+ # instance of a class if only a class is given an not an instance.\r\n+ overlay=None,\r\n+ orchestrator=MemoryOrchestrator(),\r\n )\r\n \r\n # Callee\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\nindex 46d20c8c85..fff5d4928b 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n@@ -18,6 +18,14 @@ from ....recommended_community_standards import AliceGitRepo, AlicePleaseContrib\r\n from ....dffml.operations.git.contribute import AlicePleaseContributeRecommendedCommunityStandardsOverlayGit\r\n \r\n \r\n+GitHubIssue = NewType(\"GitHubIssue\", str)\r\n+\r\n+\r\n+@dataclasses.dataclass\r\n+class RecommendedCommunityStandardContribution:\r\n+ path: pathlib.Path\r\n+ issue: GitHubIssue\r\n+\r\n \r\n class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n@@ -39,6 +47,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n MetaIssueTitle = NewType(\"MetaIssueTitle\", str)\r\n 
MetaIssueBody = NewType(\"MetaIssueBody\", str)\r\n \r\n+ # TODO This should only be run if there is a need for a README\r\n # body: Optional['ContributingIssueBody'] = \"References:\\n- https://docs.github.com/articles/setting-guidelines-for-repository-contributors/\",\r\n async def readme_issue(\r\n self,\r\n@@ -79,13 +88,40 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n ).lstrip()\r\n \r\n- # TODO(alice) There is a bug with Optional which can be revield by use here\r\n+\r\n @staticmethod\r\n+ async def readme_contribution(\r\n+ issue: \"ReadmeIssue\",\r\n+ path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n+ ) -> RecommendedCommunityStandardContribution:\r\n+ return RecommendedCommunityStandardContribution(\r\n+ path=path,\r\n+ issue=issue,\r\n+ )\r\n+\r\n+\r\n+ \"\"\"\r\n+ @dffml.op(\r\n+ stage=dffml.Stage.OUTPUT,\r\n+ )\r\n+ async def collect_recommended_community_standard_contributions(\r\n+ self,\r\n+ ) -> List[RecommendedCommunityStandardContribution]:\r\n+ async with self.octx.ictx.definitions(self.ctx) as od:\r\n+ return [item async for item in od.inputs(RecommendedCommunityStandardContribution)]\r\n+ \"\"\"\r\n+\r\n+\r\n+ # TODO(alice) There is a bug with Optional which can be revield by use here\r\n def meta_issue_body(\r\n repo: AliceGitRepo,\r\n base: AlicePleaseContributeRecommendedCommunityStandardsOverlayGit.BaseBranch,\r\n- readme_path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n- readme_issue: ReadmeIssue,\r\n+ # recommended_community_standard_contributions: List[RecommendedCommunityStandardContribution],\r\n+ # TODO On @op inspect paramter if Collect is found on an input, wrap the\r\n+ # operation in a subflow and add a generic version of\r\n+ # collect_recommended_community_standard_contributions to the flow as an\r\n+ # autostart or triggered via auto start operation.\r\n+ # recommended_community_standard_contributions: Collect[List[RecommendedCommunityStandardContribution]],\r\n ) -> \"MetaIssueBody\":\r\n \"\"\"\r\n >>> AlicePleaseContributeRecommendedCommunityStandardsGitHubIssueOverlay.meta_issue_body(\r\n@@ -98,6 +134,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n - [] [License](https://github.com/intel/dffml/blob/main/LICENSE)\r\n - [] Security\r\n \"\"\"\r\n+ readme_issue, readme_path = recommended_community_standard_contributions[0]\r\n return \"\\n\".join(\r\n [\r\n \"- [\"\r\n```",
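The `entities/alice/alice/timelines.py` stub above leaves the body of `date_alice_from_gregorian` as a TODO (and omits the `datetime` import it needs). One possible reading, sketched here purely as an assumption since the log does not specify the conversion, is whole days elapsed since Trinity Day 0:

```python
# Assumed completion of the timelines.py stub; the real semantics are TODO in
# the original, so this is only one plausible interpretation.
import datetime

# Trinity Day 0
ALICE_DAY_0_GREGORIAN = datetime.datetime(2022, 4, 16)


def date_alice_from_gregorian(date: str) -> int:
    """Days elapsed since Trinity Day 0 for an ISO formatted Gregorian date."""
    return (datetime.datetime.fromisoformat(date) - ALICE_DAY_0_GREGORIAN).days


if __name__ == "__main__":
    print(date_alice_from_gregorian("2022-07-27"))  # 102
```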
"createdAt": "2022-07-28T03:37:01Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABPLRQ==",
"hasNextPage": false
},
"totalCount": 9,
"nodes": [
{
"diff": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n### Refactoring and Thinking About Locking of Repos for Contributions\r\n\r\n- Metadata\r\n - Date: 2022-07-27 20:30 UTC -7\r\n- Saving this diff which was some work on dynamic application of overlay\r\n so as to support fixup of the OpImp for `meta_issue_body()`'s inputs.\r\n - We are going to table this for now for time reasons, but if someone\r\n wants to pick it up before @pdxjohnny is back in September, please\r\n give it a go (create an issue).\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n- `alice`\r\n - Goal: Display Alice and software analysis trinity\r\n - https://free-images.com/search/?q=alice%27s+adventures+in+wonderland&cat=st\r\n - https://free-images.com/display/de_alices_abenteuer_im_43.html\r\n - https://github.com/KhorSL/ASCII-ART\r\n - Completed in d067273f8571b6a56733336663aaebc3acb3a701\r\n\r\n![alice looking up](https://user-images.githubusercontent.com/5950433/181431145-18cfc8a7-28c8-486f-80f9-8b250e0b0943.png)\r\n\r\n```console\r\n$ python ascii_art.py /mnt/c/Users/Johnny/Downloads/alice-looking-up-white-background.png\r\n```\r\n\r\n```console\r\n$ alice\r\nusage: alice [-h] [-log LOG] {please,shouldi,threats} ...\r\n\r\n .,*&&888@@#&:,\r\n .:&::,...,:&#@@@#:.\r\n .o,. 
..:8@@#@@+\r\n .8o+,+o*+*+,+:&#@@#8@@.\r\n &8&###@#&..*:8#@@#@#@@&+.\r\n ,@:#@##@@8,:&#@@@###@88@@.\r\n ,#@8&#@@@#o:#@@@@#8#@#8+&#.\r\n +8####@@@@###@@@888#@@@#oo#.\r\n .*8@###@@@@@@@@@#o*#@@#@@#8o@,\r\n +###@#o8&#@@##8::##@@@&&#@8#&+\r\n o@8&#&##::.,o&+88#&8##8*@@#@#,\r\n .##888&&oo#&o8###8&o##8##&####8,\r\n .&#@8&:+o+&@@@#8#&8:8@@@@@#8@@@oo+\r\n ,&&#@##oo+*:@###&#88,@@@@#@o&##&8#@o,.\r\n ,#&###@@8:*,#o&@@@@##:&#@###*.&o++o#@@#&+\r\n o8&8o8@#8+,,#.88#@#&@&&#@##++*&#o&&&#@@@@.\r\n *88:,#8&#,o+:+@&8#:8@8&8#@@&o++,*++*+:#@@*.\r\n .+#:o###@8o&8*@o&o8@o888@@@o+:o*&&,@#:&@@@,\r\n *+&@8&#@o#8+8*#+8#+88@@@@@@&@###8##@8:*,\r\n +o.@##@@@&88@*8@:8@@@@@@:.. ,8@:++.\r\n +&++8@@@@##@@@@@@@@@@@+ 88\r\n &. *@8@:+##o&888#@@@, .#+\r\n &. ,@+o,.::+*+*:&#&, ,@.\r\n &. .@8*,. ,*+++.+* :8+\r\n :+ .#@::. .8:.:** .8@@o,\r\n .o. #@+ :@,.&* .:@@@@@@8**.\r\n +&. :@o,+.*o,*, .*@@@@@@@@@@#o\r\n .*:&o. 8@o:,*:, .o@@#8&&@@@@#@@@*\r\n ,*:+:::o.*&8+,++ ,&@@#: * :@@88@@@#:.\r\n ,::**:o:.,&*+*8: *8@@##o *,.8@@#8#@#@#+\r\n *:+*&o8:. ,o,o:8@+o@@88:*@+ +: +#@#####8##&.\r\n ,:&::88&, .&:#o#@@@#,+&&*#&. .:,.&#@#88#####&,\r\n +::o+&8:. :##88@@@@:.:8o+&8&. .. +8###&8&##&88*\r\n .:*+*.8#: ,o*.+&@@#@8,,o8*+8##+ .+#8##8&&#8888:.\r\n ,:o., &#8. .:8*. .o, &#,*:8:+,&*:, .8@@#o&&##8:&#8.\r\n .*o.*,+o8#* +8&, .::. .88.+:8o: ,+:, ,o#@#8&o8##&#8+\r\n +o, .+,,o#8+,8@o**.,o*, :8o +*8#* +&, ,*o@@#@&8&oo8&:,\r\n oo*+,,,*8@#..&@8:**:oo+. +8#* *+#@:...oo+ .**:8@@@ooo&:&o##+\r\n ::+..,++#@,.:##o&o**,....oo#++#8#@:.,:8&:.....*&@@#:oo*&oo&#@*\r\n .+**:*8@o,+##&o:+,,,+,,o*8#,,8@#@:,,+*o*++,,,,+&#@8*8o88&::*. .,,,,,++,\r\n ..8@++#@#88:,,,.,,,:+#&,,#@@#:,,.,&o*,.+++*:#@8+:*+. ......,:+*&,,.....\r\n +:&8#@@##8&+,,,***@&,.8@@@*,,,.:o8&o&*o&o&o. .,.****::*:o*:o*o+,.\r\n ...,*:*o&&o*8@@&o8@@@8+,,+:&&:+,... ,++*&oo&8&&&oo#@##8#&8:.\r\n o@#@@@@#@@@@@@@,..... ..,,.+*::o#@##@##@#@#########@@@8:,.\r\n ,@##@@88#@@@@@8 .:***oo*#8###8#@#@#@#@####@#@###@@#8&#:\r\n 8+.,8+..,*o#@+ ,o+o88&88###@8#######@8#8#88#8#88##88#&\r\n *o *+ #8 . ,*o&#@##@@@@@@@@@######8#888&&oo:8:\r\n 8, ,& +@* .ooo&#@@@@@#@@@@@@####@##8#8##oo:o&:,\r\n +& &, .@#. .:8#@@@@@@@@@@##8#####8#o&&#8*:8&&8:\r\n o* ,o o@& +o#@@@@@@@@#o&o88:&+ooo&:*::o:o&**o.:*+\r\n .8. 8.,o#8 .+&#@@@@@@@@&o+,::*+*:+:, ,. ,.. .,. ,.\r\n 8. 
8.,.&@:*:&@@@@@@@@8o+, ,.\r\n :@o:#,,o8&:o&@@@@#&:+.\r\n .@@@@@@@@@@@#8&o+,\r\n ,*:&#@#&o*,..\r\n\r\n /\\\r\n / \\\r\n Intent\r\n / \\\r\n / \\\r\n / \\\r\n / \\\r\n / \\\r\n / Alice is Here \\\r\n / \\\r\n / \\\r\n /______________________\\\r\n\r\n Dynamic Analysis Static Analysis\r\n\r\n Alice's source code: https://github.com/intel/dffml/tree/alice/entities/alice\r\n How we built Alice: https://github.com/intel/dffml/tree/alice/docs/tutorials/rolling_alice\r\n How to extend Alice: https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n Comment to get involved: https://github.com/intel/dffml/discussions/1406\r\n\r\n\r\npositional arguments:\r\n {please,shouldi,threats}\r\n\r\noptional arguments:\r\n -h, --help show this help message and exit\r\n -log LOG Logging Level\r\n```\r\n\r\n- TODO\r\n - [ ] Auto fork repo before push\r\n - [ ] Update origin to push to\r\n - [ ] Create PR\r\n - [ ] Update README to fix demos\r\n - [ ] Update CONTRIBUTING with tutorial on adding\r\n `CONTRIBUTING.md` check and contribution\r\n\r\n**entities/alice/alice/timelines.py**\r\n\r\n```python\r\n\"\"\"\r\nHelpers for the timelines we support\r\n\"\"\"\r\n\r\n# Trinity Day 0\r\nALICE_DAY_0_GREGORIAN = datetime.datetime(2022, 4, 16)\r\n\r\ndef date_alice_from_gregorian(date: str) -> int:\r\n # TODO\r\n return ALICE_DAY_0_GREGORIAN\r\n```\r\n\r\n```diff\r\ndiff --git a/dffml/base.py b/dffml/base.py\r\nindex fea0ef7220..9d6cd886fa 100644\r\n--- a/dffml/base.py\r\n+++ b/dffml/base.py\r\n@@ -237,6 +237,7 @@ def convert_value(arg, value, *, dataclass=None):\r\n # before checking if the value is an instance of that\r\n # type. Since it doesn't make sense to check if the\r\n # value is an instance of something that's not a type.\r\n+ print(possible_type, value)\r\n if isinstance(possible_type, type) and isinstance(\r\n value, possible_type\r\n ):\r\ndiff --git a/dffml/df/system_context/system_context.py b/dffml/df/system_context/system_context.py\r\nindex e055a343f1..063547ad0c 100644\r\n--- a/dffml/df/system_context/system_context.py\r\n+++ b/dffml/df/system_context/system_context.py\r\n@@ -90,11 +90,11 @@ class SystemContextConfig:\r\n # links: 'SystemContextConfig'\r\n overlay: Union[\"SystemContextConfig\", DataFlow] = field(\r\n \"The overlay we will apply with any overlays to merge within it (see default overlay usage docs)\",\r\n- default=APPLY_INSTALLED_OVERLAYS,\r\n+ default=None,\r\n )\r\n orchestrator: Union[\"SystemContextConfig\", BaseOrchestrator] = field(\r\n \"The system context who's default flow will be used to produce an orchestrator which will be used to execute this system context including application of overlays\",\r\n- default_factory=lambda: MemoryOrchestrator,\r\n+ default=None,\r\n )\r\n \r\n \r\n@@ -131,6 +131,7 @@ class SystemContext(BaseDataFlowFacilitatorObject):\r\n )\r\n # TODO(alice) Apply overlay\r\n if self.config.overlay not in (None, APPLY_INSTALLED_OVERLAYS):\r\n+ print(self.config.overlay)\r\n breakpoint()\r\n raise NotImplementedError(\r\n \"Application of overlays within SystemContext class entry not yet supported\"\r\ndiff --git a/dffml/high_level/dataflow.py b/dffml/high_level/dataflow.py\r\nindex d180b5c302..d595ae1cb4 100644\r\n--- a/dffml/high_level/dataflow.py\r\n+++ b/dffml/high_level/dataflow.py\r\n@@ -206,12 +206,25 @@ async def run(\r\n # the of the one that got passed in and the overlay.\r\n if inspect.isclass(overlay):\r\n overlay = overlay()\r\n+ # TODO Move this into Overlay.load. 
Create a system context to\r\n+ # execute the overlay if it is not already.\r\n+ known_overlay_types = (DataFlow, SystemContext)\r\n+ if not isinstance(overlay, known_overlay_types):\r\n+ raise NotImplementedError(f\"{overlay} is not a known type {known_overlay_types}\")\r\n+ if isinstance(overlay, DataFlow):\r\n+ overlay = SystemContext(\r\n+ upstream=overlay,\r\n+ )\r\n # TODO(alice) overlay.deployment(\"native.python.overlay.apply\")\r\n apply_overlay = overlay.deployment()\r\n async for _ctx, result in apply_overlay(\r\n dataflow=dataflow,\r\n ):\r\n+ print(\"FEEDFACE\", _ctx, result)\r\n+ breakpoint()\r\n+ return\r\n continue\r\n+\r\n # TODO\r\n resultant_system_context = SystemContext(\r\n upstream=result[\"overlays_merged\"], overlay=None,\r\ndiff --git a/dffml/overlay/overlay.py b/dffml/overlay/overlay.py\r\nindex 13a50d9c10..0a01d38de9 100644\r\n--- a/dffml/overlay/overlay.py\r\n+++ b/dffml/overlay/overlay.py\r\n@@ -124,7 +124,7 @@ DFFML_MAIN_PACKAGE_OVERLAY = DataFlow(\r\n stage=Stage.OUTPUT,\r\n inputs={\r\n \"merged\": DataFlowAfterOverlaysMerged,\r\n- \"dataflow_we_are_applying_overlays_to_by_running_overlay_dataflow_and_passing_as_an_input\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n+ \"upstream\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n },\r\n outputs={\"overlayed\": DataFlowAfterOverlaysApplied,},\r\n multi_output=False,\r\n@@ -208,15 +208,12 @@ merge_implementations(\r\n DFFML_OVERLAYS_INSTALLED.update(auto_flow=True)\r\n \r\n # Create Class for calling operations within the System Context as methods\r\n-DFFMLOverlaysInstalled = SystemContext.subclass(\r\n- \"DFFMLOverlaysInstalled\",\r\n- {\r\n- \"upstream\": {\"default_factory\": lambda: DFFML_OVERLAYS_INSTALLED},\r\n- # TODO(alice) We'll need to make sure we have code to instantiate and\r\n- # instance of a class if only a class is given an not an instance.\r\n- \"overlay\": {\"default_factory\": lambda: None},\r\n- \"orchestrator\": {\"default_factory\": lambda: MemoryOrchestrator()},\r\n- },\r\n+DFFMLOverlaysInstalled = SystemContext(\r\n+ upstream=DFFML_OVERLAYS_INSTALLED,\r\n+ # TODO(alice) We'll need to make sure we have code to instantiate and\r\n+ # instance of a class if only a class is given an not an instance.\r\n+ overlay=None,\r\n+ orchestrator=MemoryOrchestrator(),\r\n )\r\n \r\n # Callee\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\nindex 46d20c8c85..fff5d4928b 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n@@ -18,6 +18,14 @@ from ....recommended_community_standards import AliceGitRepo, AlicePleaseContrib\r\n from ....dffml.operations.git.contribute import AlicePleaseContributeRecommendedCommunityStandardsOverlayGit\r\n \r\n \r\n+GitHubIssue = NewType(\"GitHubIssue\", str)\r\n+\r\n+\r\n+@dataclasses.dataclass\r\n+class RecommendedCommunityStandardContribution:\r\n+ path: pathlib.Path\r\n+ issue: GitHubIssue\r\n+\r\n \r\n class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n@@ -39,6 +47,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n MetaIssueTitle = NewType(\"MetaIssueTitle\", str)\r\n 
MetaIssueBody = NewType(\"MetaIssueBody\", str)\r\n \r\n+ # TODO This should only be run if there is a need for a README\r\n # body: Optional['ContributingIssueBody'] = \"References:\\n- https://docs.github.com/articles/setting-guidelines-for-repository-contributors/\",\r\n async def readme_issue(\r\n self,\r\n@@ -79,13 +88,40 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n ).lstrip()\r\n \r\n- # TODO(alice) There is a bug with Optional which can be revield by use here\r\n+\r\n @staticmethod\r\n+ async def readme_contribution(\r\n+ issue: \"ReadmeIssue\",\r\n+ path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n+ ) -> RecommendedCommunityStandardContribution:\r\n+ return RecommendedCommunityStandardContribution(\r\n+ path=path,\r\n+ issue=issue,\r\n+ )\r\n+\r\n+\r\n+ \"\"\"\r\n+ @dffml.op(\r\n+ stage=dffml.Stage.OUTPUT,\r\n+ )\r\n+ async def collect_recommended_community_standard_contributions(\r\n+ self,\r\n+ ) -> List[RecommendedCommunityStandardContribution]:\r\n+ async with self.octx.ictx.definitions(self.ctx) as od:\r\n+ return [item async for item in od.inputs(RecommendedCommunityStandardContribution)]\r\n+ \"\"\"\r\n+\r\n+\r\n+ # TODO(alice) There is a bug with Optional which can be revield by use here\r\n def meta_issue_body(\r\n repo: AliceGitRepo,\r\n base: AlicePleaseContributeRecommendedCommunityStandardsOverlayGit.BaseBranch,\r\n- readme_path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n- readme_issue: ReadmeIssue,\r\n+ # recommended_community_standard_contributions: List[RecommendedCommunityStandardContribution],\r\n+ # TODO On @op inspect paramter if Collect is found on an input, wrap the\r\n+ # operation in a subflow and add a generic version of\r\n+ # collect_recommended_community_standard_contributions to the flow as an\r\n+ # autostart or triggered via auto start operation.\r\n+ # recommended_community_standard_contributions: Collect[List[RecommendedCommunityStandardContribution]],\r\n ) -> \"MetaIssueBody\":\r\n \"\"\"\r\n >>> AlicePleaseContributeRecommendedCommunityStandardsGitHubIssueOverlay.meta_issue_body(\r\n@@ -98,6 +134,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n - [] [License](https://github.com/intel/dffml/blob/main/LICENSE)\r\n - [] Security\r\n \"\"\"\r\n+ readme_issue, readme_path = recommended_community_standard_contributions[0]\r\n return \"\\n\".join(\r\n [\r\n \"- [\"\r\n```",
"editedAt": "2022-07-28T06:53:11Z"
},
{
"diff": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n### Refactoring and Thinking About Locking of Repos for Contributions\r\n\r\n- Metadata\r\n - Date: 2022-07-27 20:30 UTC -7\r\n- Saving this diff which was some work on dynamic application of overlay\r\n so as to support fixup of the OpImp for `meta_issue_body()`'s inputs.\r\n - We are going to table this for now for time reasons, but if someone\r\n wants to pick it up before @pdxjohnny is back in September, please\r\n give it a go (create an issue).\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n- `alice`\r\n - Goal: Display Alice and software analysis trinity\r\n - https://free-images.com/search/?q=alice%27s+adventures+in+wonderland&cat=st\r\n - https://free-images.com/display/de_alices_abenteuer_im_43.html\r\n - https://github.com/KhorSL/ASCII-ART\r\n - Completed in d067273f8571b6a56733336663aaebc3acb3a701\r\n\r\n![alice looking up](https://user-images.githubusercontent.com/5950433/181431145-18cfc8a7-28c8-486f-80f9-8b250e0b0943.png)\r\n\r\n\r\n```\r\n$ python ascii_art.py /mnt/c/Users/Johnny/Downloads/alice-looking-up-white-background.png\r\n .,*&&888@@#&:,\r\n .:&::,...,:&#@@@#:.\r\n .o,. 
..:8@@#@@+\r\n .8o+,+o*+*+,+:&#@@#8@@.\r\n &8&###@#&..*:8#@@#@#@@&+.\r\n ,@:#@##@@8,:&#@@@###@88@@.\r\n ,#@8&#@@@#o:#@@@@#8#@#8+&#.\r\n +8####@@@@###@@@888#@@@#oo#.\r\n .*8@###@@@@@@@@@#o*#@@#@@#8o@,\r\n +###@#o8&#@@##8::##@@@&&#@8#&+\r\n o@8&#&##::.,o&+88#&8##8*@@#@#,\r\n .##888&&oo#&o8###8&o##8##&####8,\r\n .&#@8&:+o+&@@@#8#&8:8@@@@@#8@@@oo+\r\n ,&&#@##oo+*:@###&#88,@@@@#@o&##&8#@o,.\r\n ,#&###@@8:*,#o&@@@@##:&#@###*.&o++o#@@#&+\r\n o8&8o8@#8+,,#.88#@#&@&&#@##++*&#o&&&#@@@@.\r\n *88:,#8&#,o+:+@&8#:8@8&8#@@&o++,*++*+:#@@*.\r\n .+#:o###@8o&8*@o&o8@o888@@@o+:o*&&,@#:&@@@,\r\n *+&@8&#@o#8+8*#+8#+88@@@@@@&@###8##@8:*,\r\n +o.@##@@@&88@*8@:8@@@@@@:.. ,8@:++.\r\n +&++8@@@@##@@@@@@@@@@@+ 88\r\n &. *@8@:+##o&888#@@@, .#+\r\n &. ,@+o,.::+*+*:&#&, ,@.\r\n &. .@8*,. ,*+++.+* :8+\r\n :+ .#@::. .8:.:** .8@@o,\r\n .o. #@+ :@,.&* .:@@@@@@8**.\r\n +&. :@o,+.*o,*, .*@@@@@@@@@@#o\r\n .*:&o. 8@o:,*:, .o@@#8&&@@@@#@@@*\r\n ,*:+:::o.*&8+,++ ,&@@#: * :@@88@@@#:.\r\n ,::**:o:.,&*+*8: *8@@##o *,.8@@#8#@#@#+\r\n *:+*&o8:. ,o,o:8@+o@@88:*@+ +: +#@#####8##&.\r\n ,:&::88&, .&:#o#@@@#,+&&*#&. .:,.&#@#88#####&,\r\n +::o+&8:. :##88@@@@:.:8o+&8&. .. +8###&8&##&88*\r\n .:*+*.8#: ,o*.+&@@#@8,,o8*+8##+ .+#8##8&&#8888:.\r\n ,:o., &#8. .:8*. .o, &#,*:8:+,&*:, .8@@#o&&##8:&#8.\r\n .*o.*,+o8#* +8&, .::. .88.+:8o: ,+:, ,o#@#8&o8##&#8+\r\n +o, .+,,o#8+,8@o**.,o*, :8o +*8#* +&, ,*o@@#@&8&oo8&:,\r\noo*+,,,*8@#..&@8:**:oo+. +8#* *+#@:...oo+ .**:8@@@ooo&:&o##+\r\n::+..,++#@,.:##o&o**,....oo#++#8#@:.,:8&:.....*&@@#:oo*&oo&#@*\r\n .+**:*8@o,+##&o:+,,,+,,o*8#,,8@#@:,,+*o*++,,,,+&#@8*8o88&::*. .,,,,,++,\r\n ..8@++#@#88:,,,.,,,:+#&,,#@@#:,,.,&o*,.+++*:#@8+:*+. ......,:+*&,,.....\r\n +:&8#@@##8&+,,,***@&,.8@@@*,,,.:o8&o&*o&o&o. .,.****::*:o*:o*o+,.\r\n ...,*:*o&&o*8@@&o8@@@8+,,+:&&:+,... ,++*&oo&8&&&oo#@##8#&8:.\r\n o@#@@@@#@@@@@@@,..... ..,,.+*::o#@##@##@#@#########@@@8:,.\r\n ,@##@@88#@@@@@8 .:***oo*#8###8#@#@#@#@####@#@###@@#8&#:\r\n 8+.,8+..,*o#@+ ,o+o88&88###@8#######@8#8#88#8#88##88#&\r\n *o *+ #8 . ,*o&#@##@@@@@@@@@######8#888&&oo:8:\r\n 8, ,& +@* .ooo&#@@@@@#@@@@@@####@##8#8##oo:o&:,\r\n +& &, .@#. .:8#@@@@@@@@@@##8#####8#o&&#8*:8&&8:\r\n o* ,o o@& +o#@@@@@@@@#o&o88:&+ooo&:*::o:o&**o.:*+\r\n .8. 8.,o#8 .+&#@@@@@@@@&o+,::*+*:+:, ,. ,.. .,. ,.\r\n 8. 8.,.&@:*:&@@@@@@@@8o+, ,.\r\n :@o:#,,o8&:o&@@@@#&:+.\r\n .@@@@@@@@@@@#8&o+,\r\n ,*:&#@#&o*,..\r\n```\r\n\r\n- TODO\r\n - [ ] Auto fork repo before push\r\n - [ ] Update origin to push to\r\n - [ ] Create PR\r\n - [ ] Update README to fix demos\r\n - [ ] Update CONTRIBUTING with tutorial on adding\r\n `CONTRIBUTING.md` check and contribution\r\n\r\n**entities/alice/alice/timelines.py**\r\n\r\n```python\r\n\"\"\"\r\nHelpers for the timelines we support\r\n\"\"\"\r\n\r\n# Trinity Day 0\r\nALICE_DAY_0_GREGORIAN = datetime.datetime(2022, 4, 16)\r\n\r\ndef date_alice_from_gregorian(date: str) -> int:\r\n # TODO\r\n return ALICE_DAY_0_GREGORIAN\r\n```\r\n\r\n```diff\r\ndiff --git a/dffml/base.py b/dffml/base.py\r\nindex fea0ef7220..9d6cd886fa 100644\r\n--- a/dffml/base.py\r\n+++ b/dffml/base.py\r\n@@ -237,6 +237,7 @@ def convert_value(arg, value, *, dataclass=None):\r\n # before checking if the value is an instance of that\r\n # type. 
Since it doesn't make sense to check if the\r\n # value is an instance of something that's not a type.\r\n+ print(possible_type, value)\r\n if isinstance(possible_type, type) and isinstance(\r\n value, possible_type\r\n ):\r\ndiff --git a/dffml/df/system_context/system_context.py b/dffml/df/system_context/system_context.py\r\nindex e055a343f1..063547ad0c 100644\r\n--- a/dffml/df/system_context/system_context.py\r\n+++ b/dffml/df/system_context/system_context.py\r\n@@ -90,11 +90,11 @@ class SystemContextConfig:\r\n # links: 'SystemContextConfig'\r\n overlay: Union[\"SystemContextConfig\", DataFlow] = field(\r\n \"The overlay we will apply with any overlays to merge within it (see default overlay usage docs)\",\r\n- default=APPLY_INSTALLED_OVERLAYS,\r\n+ default=None,\r\n )\r\n orchestrator: Union[\"SystemContextConfig\", BaseOrchestrator] = field(\r\n \"The system context who's default flow will be used to produce an orchestrator which will be used to execute this system context including application of overlays\",\r\n- default_factory=lambda: MemoryOrchestrator,\r\n+ default=None,\r\n )\r\n \r\n \r\n@@ -131,6 +131,7 @@ class SystemContext(BaseDataFlowFacilitatorObject):\r\n )\r\n # TODO(alice) Apply overlay\r\n if self.config.overlay not in (None, APPLY_INSTALLED_OVERLAYS):\r\n+ print(self.config.overlay)\r\n breakpoint()\r\n raise NotImplementedError(\r\n \"Application of overlays within SystemContext class entry not yet supported\"\r\ndiff --git a/dffml/high_level/dataflow.py b/dffml/high_level/dataflow.py\r\nindex d180b5c302..d595ae1cb4 100644\r\n--- a/dffml/high_level/dataflow.py\r\n+++ b/dffml/high_level/dataflow.py\r\n@@ -206,12 +206,25 @@ async def run(\r\n # the of the one that got passed in and the overlay.\r\n if inspect.isclass(overlay):\r\n overlay = overlay()\r\n+ # TODO Move this into Overlay.load. 
Create a system context to\r\n+ # execute the overlay if it is not already.\r\n+ known_overlay_types = (DataFlow, SystemContext)\r\n+ if not isinstance(overlay, known_overlay_types):\r\n+ raise NotImplementedError(f\"{overlay} is not a known type {known_overlay_types}\")\r\n+ if isinstance(overlay, DataFlow):\r\n+ overlay = SystemContext(\r\n+ upstream=overlay,\r\n+ )\r\n # TODO(alice) overlay.deployment(\"native.python.overlay.apply\")\r\n apply_overlay = overlay.deployment()\r\n async for _ctx, result in apply_overlay(\r\n dataflow=dataflow,\r\n ):\r\n+ print(\"FEEDFACE\", _ctx, result)\r\n+ breakpoint()\r\n+ return\r\n continue\r\n+\r\n # TODO\r\n resultant_system_context = SystemContext(\r\n upstream=result[\"overlays_merged\"], overlay=None,\r\ndiff --git a/dffml/overlay/overlay.py b/dffml/overlay/overlay.py\r\nindex 13a50d9c10..0a01d38de9 100644\r\n--- a/dffml/overlay/overlay.py\r\n+++ b/dffml/overlay/overlay.py\r\n@@ -124,7 +124,7 @@ DFFML_MAIN_PACKAGE_OVERLAY = DataFlow(\r\n stage=Stage.OUTPUT,\r\n inputs={\r\n \"merged\": DataFlowAfterOverlaysMerged,\r\n- \"dataflow_we_are_applying_overlays_to_by_running_overlay_dataflow_and_passing_as_an_input\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n+ \"upstream\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n },\r\n outputs={\"overlayed\": DataFlowAfterOverlaysApplied,},\r\n multi_output=False,\r\n@@ -208,15 +208,12 @@ merge_implementations(\r\n DFFML_OVERLAYS_INSTALLED.update(auto_flow=True)\r\n \r\n # Create Class for calling operations within the System Context as methods\r\n-DFFMLOverlaysInstalled = SystemContext.subclass(\r\n- \"DFFMLOverlaysInstalled\",\r\n- {\r\n- \"upstream\": {\"default_factory\": lambda: DFFML_OVERLAYS_INSTALLED},\r\n- # TODO(alice) We'll need to make sure we have code to instantiate and\r\n- # instance of a class if only a class is given an not an instance.\r\n- \"overlay\": {\"default_factory\": lambda: None},\r\n- \"orchestrator\": {\"default_factory\": lambda: MemoryOrchestrator()},\r\n- },\r\n+DFFMLOverlaysInstalled = SystemContext(\r\n+ upstream=DFFML_OVERLAYS_INSTALLED,\r\n+ # TODO(alice) We'll need to make sure we have code to instantiate and\r\n+ # instance of a class if only a class is given an not an instance.\r\n+ overlay=None,\r\n+ orchestrator=MemoryOrchestrator(),\r\n )\r\n \r\n # Callee\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\nindex 46d20c8c85..fff5d4928b 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n@@ -18,6 +18,14 @@ from ....recommended_community_standards import AliceGitRepo, AlicePleaseContrib\r\n from ....dffml.operations.git.contribute import AlicePleaseContributeRecommendedCommunityStandardsOverlayGit\r\n \r\n \r\n+GitHubIssue = NewType(\"GitHubIssue\", str)\r\n+\r\n+\r\n+@dataclasses.dataclass\r\n+class RecommendedCommunityStandardContribution:\r\n+ path: pathlib.Path\r\n+ issue: GitHubIssue\r\n+\r\n \r\n class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n@@ -39,6 +47,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n MetaIssueTitle = NewType(\"MetaIssueTitle\", str)\r\n 
MetaIssueBody = NewType(\"MetaIssueBody\", str)\r\n \r\n+ # TODO This should only be run if there is a need for a README\r\n # body: Optional['ContributingIssueBody'] = \"References:\\n- https://docs.github.com/articles/setting-guidelines-for-repository-contributors/\",\r\n async def readme_issue(\r\n self,\r\n@@ -79,13 +88,40 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n ).lstrip()\r\n \r\n- # TODO(alice) There is a bug with Optional which can be revield by use here\r\n+\r\n @staticmethod\r\n+ async def readme_contribution(\r\n+ issue: \"ReadmeIssue\",\r\n+ path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n+ ) -> RecommendedCommunityStandardContribution:\r\n+ return RecommendedCommunityStandardContribution(\r\n+ path=path,\r\n+ issue=issue,\r\n+ )\r\n+\r\n+\r\n+ \"\"\"\r\n+ @dffml.op(\r\n+ stage=dffml.Stage.OUTPUT,\r\n+ )\r\n+ async def collect_recommended_community_standard_contributions(\r\n+ self,\r\n+ ) -> List[RecommendedCommunityStandardContribution]:\r\n+ async with self.octx.ictx.definitions(self.ctx) as od:\r\n+ return [item async for item in od.inputs(RecommendedCommunityStandardContribution)]\r\n+ \"\"\"\r\n+\r\n+\r\n+ # TODO(alice) There is a bug with Optional which can be revield by use here\r\n def meta_issue_body(\r\n repo: AliceGitRepo,\r\n base: AlicePleaseContributeRecommendedCommunityStandardsOverlayGit.BaseBranch,\r\n- readme_path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n- readme_issue: ReadmeIssue,\r\n+ # recommended_community_standard_contributions: List[RecommendedCommunityStandardContribution],\r\n+ # TODO On @op inspect paramter if Collect is found on an input, wrap the\r\n+ # operation in a subflow and add a generic version of\r\n+ # collect_recommended_community_standard_contributions to the flow as an\r\n+ # autostart or triggered via auto start operation.\r\n+ # recommended_community_standard_contributions: Collect[List[RecommendedCommunityStandardContribution]],\r\n ) -> \"MetaIssueBody\":\r\n \"\"\"\r\n >>> AlicePleaseContributeRecommendedCommunityStandardsGitHubIssueOverlay.meta_issue_body(\r\n@@ -98,6 +134,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n - [] [License](https://github.com/intel/dffml/blob/main/LICENSE)\r\n - [] Security\r\n \"\"\"\r\n+ readme_issue, readme_path = recommended_community_standard_contributions[0]\r\n return \"\\n\".join(\r\n [\r\n \"- [\"\r\n```",
"editedAt": "2022-07-28T06:52:21Z"
},
{
"diff": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n### Refactoring and Thinking About Locking of Repos for Contributions\r\n\r\n- Metadata\r\n - Date: 2022-07-27 20:30 UTC -7\r\n- Saving this diff which was some work on dynamic application of overlay\r\n so as to support fixup of the OpImp for `meta_issue_body()`'s inputs.\r\n - We are going to table this for now for time reasons, but if someone\r\n wants to pick it up before @pdxjohnny is back in September, please\r\n give it a go (create an issue).\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n- `alice version`\r\n - https://free-images.com/search/?q=alice%27s+adventures+in+wonderland&cat=st\r\n - https://free-images.com/display/de_alices_abenteuer_im_43.html\r\n - https://github.com/KhorSL/ASCII-ART\r\n\r\n![alice looking up](https://user-images.githubusercontent.com/5950433/181431145-18cfc8a7-28c8-486f-80f9-8b250e0b0943.png)\r\n\r\n\r\n```\r\n$ python ascii_art.py /mnt/c/Users/Johnny/Downloads/alice-looking-up-white-background.png\r\n .,*&&888@@#&:,\r\n .:&::,...,:&#@@@#:.\r\n .o,. ..:8@@#@@+\r\n .8o+,+o*+*+,+:&#@@#8@@.\r\n &8&###@#&..*:8#@@#@#@@&+.\r\n ,@:#@##@@8,:&#@@@###@88@@.\r\n ,#@8&#@@@#o:#@@@@#8#@#8+&#.\r\n +8####@@@@###@@@888#@@@#oo#.\r\n .*8@###@@@@@@@@@#o*#@@#@@#8o@,\r\n +###@#o8&#@@##8::##@@@&&#@8#&+\r\n o@8&#&##::.,o&+88#&8##8*@@#@#,\r\n .##888&&oo#&o8###8&o##8##&####8,\r\n .&#@8&:+o+&@@@#8#&8:8@@@@@#8@@@oo+\r\n ,&&#@##oo+*:@###&#88,@@@@#@o&##&8#@o,.\r\n ,#&###@@8:*,#o&@@@@##:&#@###*.&o++o#@@#&+\r\n o8&8o8@#8+,,#.88#@#&@&&#@##++*&#o&&&#@@@@.\r\n *88:,#8&#,o+:+@&8#:8@8&8#@@&o++,*++*+:#@@*.\r\n .+#:o###@8o&8*@o&o8@o888@@@o+:o*&&,@#:&@@@,\r\n *+&@8&#@o#8+8*#+8#+88@@@@@@&@###8##@8:*,\r\n +o.@##@@@&88@*8@:8@@@@@@:.. ,8@:++.\r\n +&++8@@@@##@@@@@@@@@@@+ 88\r\n &. *@8@:+##o&888#@@@, .#+\r\n &. 
,@+o,.::+*+*:&#&, ,@.\r\n &. .@8*,. ,*+++.+* :8+\r\n :+ .#@::. .8:.:** .8@@o,\r\n .o. #@+ :@,.&* .:@@@@@@8**.\r\n +&. :@o,+.*o,*, .*@@@@@@@@@@#o\r\n .*:&o. 8@o:,*:, .o@@#8&&@@@@#@@@*\r\n ,*:+:::o.*&8+,++ ,&@@#: * :@@88@@@#:.\r\n ,::**:o:.,&*+*8: *8@@##o *,.8@@#8#@#@#+\r\n *:+*&o8:. ,o,o:8@+o@@88:*@+ +: +#@#####8##&.\r\n ,:&::88&, .&:#o#@@@#,+&&*#&. .:,.&#@#88#####&,\r\n +::o+&8:. :##88@@@@:.:8o+&8&. .. +8###&8&##&88*\r\n .:*+*.8#: ,o*.+&@@#@8,,o8*+8##+ .+#8##8&&#8888:.\r\n ,:o., &#8. .:8*. .o, &#,*:8:+,&*:, .8@@#o&&##8:&#8.\r\n .*o.*,+o8#* +8&, .::. .88.+:8o: ,+:, ,o#@#8&o8##&#8+\r\n +o, .+,,o#8+,8@o**.,o*, :8o +*8#* +&, ,*o@@#@&8&oo8&:,\r\noo*+,,,*8@#..&@8:**:oo+. +8#* *+#@:...oo+ .**:8@@@ooo&:&o##+\r\n::+..,++#@,.:##o&o**,....oo#++#8#@:.,:8&:.....*&@@#:oo*&oo&#@*\r\n .+**:*8@o,+##&o:+,,,+,,o*8#,,8@#@:,,+*o*++,,,,+&#@8*8o88&::*. .,,,,,++,\r\n ..8@++#@#88:,,,.,,,:+#&,,#@@#:,,.,&o*,.+++*:#@8+:*+. ......,:+*&,,.....\r\n +:&8#@@##8&+,,,***@&,.8@@@*,,,.:o8&o&*o&o&o. .,.****::*:o*:o*o+,.\r\n ...,*:*o&&o*8@@&o8@@@8+,,+:&&:+,... ,++*&oo&8&&&oo#@##8#&8:.\r\n o@#@@@@#@@@@@@@,..... ..,,.+*::o#@##@##@#@#########@@@8:,.\r\n ,@##@@88#@@@@@8 .:***oo*#8###8#@#@#@#@####@#@###@@#8&#:\r\n 8+.,8+..,*o#@+ ,o+o88&88###@8#######@8#8#88#8#88##88#&\r\n *o *+ #8 . ,*o&#@##@@@@@@@@@######8#888&&oo:8:\r\n 8, ,& +@* .ooo&#@@@@@#@@@@@@####@##8#8##oo:o&:,\r\n +& &, .@#. .:8#@@@@@@@@@@##8#####8#o&&#8*:8&&8:\r\n o* ,o o@& +o#@@@@@@@@#o&o88:&+ooo&:*::o:o&**o.:*+\r\n .8. 8.,o#8 .+&#@@@@@@@@&o+,::*+*:+:, ,. ,.. .,. ,.\r\n 8. 8.,.&@:*:&@@@@@@@@8o+, ,.\r\n :@o:#,,o8&:o&@@@@#&:+.\r\n .@@@@@@@@@@@#8&o+,\r\n ,*:&#@#&o*,..\r\n```\r\n\r\n- TODO\r\n - [ ] Auto fork repo before push\r\n - [ ] Update origin to push to\r\n - [ ] Create PR\r\n - [ ] Update README to fix demos\r\n - [ ] Update CONTRIBUTING with tutorial on adding\r\n `CONTRIBUTING.md` check and contribution\r\n\r\n**entities/alice/alice/timelines.py**\r\n\r\n```python\r\n\"\"\"\r\nHelpers for the timelines we support\r\n\"\"\"\r\n\r\n# Trinity Day 0\r\nALICE_DAY_0_GREGORIAN = datetime.datetime(2022, 4, 16)\r\n\r\ndef date_alice_from_gregorian(date: str) -> int:\r\n # TODO\r\n return ALICE_DAY_0_GREGORIAN\r\n```\r\n\r\n```diff\r\ndiff --git a/dffml/base.py b/dffml/base.py\r\nindex fea0ef7220..9d6cd886fa 100644\r\n--- a/dffml/base.py\r\n+++ b/dffml/base.py\r\n@@ -237,6 +237,7 @@ def convert_value(arg, value, *, dataclass=None):\r\n # before checking if the value is an instance of that\r\n # type. 
Since it doesn't make sense to check if the\r\n # value is an instance of something that's not a type.\r\n+ print(possible_type, value)\r\n if isinstance(possible_type, type) and isinstance(\r\n value, possible_type\r\n ):\r\ndiff --git a/dffml/df/system_context/system_context.py b/dffml/df/system_context/system_context.py\r\nindex e055a343f1..063547ad0c 100644\r\n--- a/dffml/df/system_context/system_context.py\r\n+++ b/dffml/df/system_context/system_context.py\r\n@@ -90,11 +90,11 @@ class SystemContextConfig:\r\n # links: 'SystemContextConfig'\r\n overlay: Union[\"SystemContextConfig\", DataFlow] = field(\r\n \"The overlay we will apply with any overlays to merge within it (see default overlay usage docs)\",\r\n- default=APPLY_INSTALLED_OVERLAYS,\r\n+ default=None,\r\n )\r\n orchestrator: Union[\"SystemContextConfig\", BaseOrchestrator] = field(\r\n \"The system context who's default flow will be used to produce an orchestrator which will be used to execute this system context including application of overlays\",\r\n- default_factory=lambda: MemoryOrchestrator,\r\n+ default=None,\r\n )\r\n \r\n \r\n@@ -131,6 +131,7 @@ class SystemContext(BaseDataFlowFacilitatorObject):\r\n )\r\n # TODO(alice) Apply overlay\r\n if self.config.overlay not in (None, APPLY_INSTALLED_OVERLAYS):\r\n+ print(self.config.overlay)\r\n breakpoint()\r\n raise NotImplementedError(\r\n \"Application of overlays within SystemContext class entry not yet supported\"\r\ndiff --git a/dffml/high_level/dataflow.py b/dffml/high_level/dataflow.py\r\nindex d180b5c302..d595ae1cb4 100644\r\n--- a/dffml/high_level/dataflow.py\r\n+++ b/dffml/high_level/dataflow.py\r\n@@ -206,12 +206,25 @@ async def run(\r\n # the of the one that got passed in and the overlay.\r\n if inspect.isclass(overlay):\r\n overlay = overlay()\r\n+ # TODO Move this into Overlay.load. 
Create a system context to\r\n+ # execute the overlay if it is not already.\r\n+ known_overlay_types = (DataFlow, SystemContext)\r\n+ if not isinstance(overlay, known_overlay_types):\r\n+ raise NotImplementedError(f\"{overlay} is not a known type {known_overlay_types}\")\r\n+ if isinstance(overlay, DataFlow):\r\n+ overlay = SystemContext(\r\n+ upstream=overlay,\r\n+ )\r\n # TODO(alice) overlay.deployment(\"native.python.overlay.apply\")\r\n apply_overlay = overlay.deployment()\r\n async for _ctx, result in apply_overlay(\r\n dataflow=dataflow,\r\n ):\r\n+ print(\"FEEDFACE\", _ctx, result)\r\n+ breakpoint()\r\n+ return\r\n continue\r\n+\r\n # TODO\r\n resultant_system_context = SystemContext(\r\n upstream=result[\"overlays_merged\"], overlay=None,\r\ndiff --git a/dffml/overlay/overlay.py b/dffml/overlay/overlay.py\r\nindex 13a50d9c10..0a01d38de9 100644\r\n--- a/dffml/overlay/overlay.py\r\n+++ b/dffml/overlay/overlay.py\r\n@@ -124,7 +124,7 @@ DFFML_MAIN_PACKAGE_OVERLAY = DataFlow(\r\n stage=Stage.OUTPUT,\r\n inputs={\r\n \"merged\": DataFlowAfterOverlaysMerged,\r\n- \"dataflow_we_are_applying_overlays_to_by_running_overlay_dataflow_and_passing_as_an_input\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n+ \"upstream\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n },\r\n outputs={\"overlayed\": DataFlowAfterOverlaysApplied,},\r\n multi_output=False,\r\n@@ -208,15 +208,12 @@ merge_implementations(\r\n DFFML_OVERLAYS_INSTALLED.update(auto_flow=True)\r\n \r\n # Create Class for calling operations within the System Context as methods\r\n-DFFMLOverlaysInstalled = SystemContext.subclass(\r\n- \"DFFMLOverlaysInstalled\",\r\n- {\r\n- \"upstream\": {\"default_factory\": lambda: DFFML_OVERLAYS_INSTALLED},\r\n- # TODO(alice) We'll need to make sure we have code to instantiate and\r\n- # instance of a class if only a class is given an not an instance.\r\n- \"overlay\": {\"default_factory\": lambda: None},\r\n- \"orchestrator\": {\"default_factory\": lambda: MemoryOrchestrator()},\r\n- },\r\n+DFFMLOverlaysInstalled = SystemContext(\r\n+ upstream=DFFML_OVERLAYS_INSTALLED,\r\n+ # TODO(alice) We'll need to make sure we have code to instantiate and\r\n+ # instance of a class if only a class is given an not an instance.\r\n+ overlay=None,\r\n+ orchestrator=MemoryOrchestrator(),\r\n )\r\n \r\n # Callee\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\nindex 46d20c8c85..fff5d4928b 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n@@ -18,6 +18,14 @@ from ....recommended_community_standards import AliceGitRepo, AlicePleaseContrib\r\n from ....dffml.operations.git.contribute import AlicePleaseContributeRecommendedCommunityStandardsOverlayGit\r\n \r\n \r\n+GitHubIssue = NewType(\"GitHubIssue\", str)\r\n+\r\n+\r\n+@dataclasses.dataclass\r\n+class RecommendedCommunityStandardContribution:\r\n+ path: pathlib.Path\r\n+ issue: GitHubIssue\r\n+\r\n \r\n class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n@@ -39,6 +47,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n MetaIssueTitle = NewType(\"MetaIssueTitle\", str)\r\n 
MetaIssueBody = NewType(\"MetaIssueBody\", str)\r\n \r\n+ # TODO This should only be run if there is a need for a README\r\n # body: Optional['ContributingIssueBody'] = \"References:\\n- https://docs.github.com/articles/setting-guidelines-for-repository-contributors/\",\r\n async def readme_issue(\r\n self,\r\n@@ -79,13 +88,40 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n ).lstrip()\r\n \r\n- # TODO(alice) There is a bug with Optional which can be revield by use here\r\n+\r\n @staticmethod\r\n+ async def readme_contribution(\r\n+ issue: \"ReadmeIssue\",\r\n+ path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n+ ) -> RecommendedCommunityStandardContribution:\r\n+ return RecommendedCommunityStandardContribution(\r\n+ path=path,\r\n+ issue=issue,\r\n+ )\r\n+\r\n+\r\n+ \"\"\"\r\n+ @dffml.op(\r\n+ stage=dffml.Stage.OUTPUT,\r\n+ )\r\n+ async def collect_recommended_community_standard_contributions(\r\n+ self,\r\n+ ) -> List[RecommendedCommunityStandardContribution]:\r\n+ async with self.octx.ictx.definitions(self.ctx) as od:\r\n+ return [item async for item in od.inputs(RecommendedCommunityStandardContribution)]\r\n+ \"\"\"\r\n+\r\n+\r\n+ # TODO(alice) There is a bug with Optional which can be revield by use here\r\n def meta_issue_body(\r\n repo: AliceGitRepo,\r\n base: AlicePleaseContributeRecommendedCommunityStandardsOverlayGit.BaseBranch,\r\n- readme_path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n- readme_issue: ReadmeIssue,\r\n+ # recommended_community_standard_contributions: List[RecommendedCommunityStandardContribution],\r\n+ # TODO On @op inspect paramter if Collect is found on an input, wrap the\r\n+ # operation in a subflow and add a generic version of\r\n+ # collect_recommended_community_standard_contributions to the flow as an\r\n+ # autostart or triggered via auto start operation.\r\n+ # recommended_community_standard_contributions: Collect[List[RecommendedCommunityStandardContribution]],\r\n ) -> \"MetaIssueBody\":\r\n \"\"\"\r\n >>> AlicePleaseContributeRecommendedCommunityStandardsGitHubIssueOverlay.meta_issue_body(\r\n@@ -98,6 +134,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n - [] [License](https://github.com/intel/dffml/blob/main/LICENSE)\r\n - [] Security\r\n \"\"\"\r\n+ readme_issue, readme_path = recommended_community_standard_contributions[0]\r\n return \"\\n\".join(\r\n [\r\n \"- [\"\r\n```",
"editedAt": "2022-07-28T06:08:34Z"
},
{
"diff": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n### Refactoring and Thinking About Locking of Repos for Contributions\r\n\r\n- Metadata\r\n - Date: 2022-07-27 20:30 UTC -7\r\n- Saving this diff which was some work on dynamic application of overlay\r\n so as to support fixup of the OpImp for `meta_issue_body()`'s inputs.\r\n - We are going to table this for now for time reasons, but if someone\r\n wants to pick it up before @pdxjohnny is back in September, please\r\n give it a go (create an issue).\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n- `alice version`\r\n - https://free-images.com/display/de_alices_abenteuer_im_43.html\r\n - https://github.com/KhorSL/ASCII-ART\r\n - ![alice looking up](https://user-images.githubusercontent.com/5950433/181430120-31e99c0b-4f52-422f-b6c9-b69b2c693a0d.png)\r\n\r\n- TODO\r\n - [ ] Auto fork repo before push\r\n - [ ] Update origin to push to\r\n - [ ] Create PR\r\n - [ ] Update README to fix demos\r\n - [ ] Update CONTRIBUTING with tutorial on adding\r\n `CONTRIBUTING.md` check and contribution\r\n\r\n**entities/alice/alice/timelines.py**\r\n\r\n```python\r\n\"\"\"\r\nHelpers for the timelines we support\r\n\"\"\"\r\n\r\n# Trinity Day 0\r\nALICE_DAY_0_GREGORIAN = datetime.datetime(2022, 4, 16)\r\n\r\ndef date_alice_from_gregorian(date: str) -> int:\r\n # TODO\r\n return ALICE_DAY_0_GREGORIAN\r\n```\r\n\r\n```diff\r\ndiff --git a/dffml/base.py b/dffml/base.py\r\nindex fea0ef7220..9d6cd886fa 100644\r\n--- a/dffml/base.py\r\n+++ b/dffml/base.py\r\n@@ -237,6 +237,7 @@ def convert_value(arg, value, *, dataclass=None):\r\n # before checking if the value is an instance of that\r\n # type. 
Since it doesn't make sense to check if the\r\n # value is an instance of something that's not a type.\r\n+ print(possible_type, value)\r\n if isinstance(possible_type, type) and isinstance(\r\n value, possible_type\r\n ):\r\ndiff --git a/dffml/df/system_context/system_context.py b/dffml/df/system_context/system_context.py\r\nindex e055a343f1..063547ad0c 100644\r\n--- a/dffml/df/system_context/system_context.py\r\n+++ b/dffml/df/system_context/system_context.py\r\n@@ -90,11 +90,11 @@ class SystemContextConfig:\r\n # links: 'SystemContextConfig'\r\n overlay: Union[\"SystemContextConfig\", DataFlow] = field(\r\n \"The overlay we will apply with any overlays to merge within it (see default overlay usage docs)\",\r\n- default=APPLY_INSTALLED_OVERLAYS,\r\n+ default=None,\r\n )\r\n orchestrator: Union[\"SystemContextConfig\", BaseOrchestrator] = field(\r\n \"The system context who's default flow will be used to produce an orchestrator which will be used to execute this system context including application of overlays\",\r\n- default_factory=lambda: MemoryOrchestrator,\r\n+ default=None,\r\n )\r\n \r\n \r\n@@ -131,6 +131,7 @@ class SystemContext(BaseDataFlowFacilitatorObject):\r\n )\r\n # TODO(alice) Apply overlay\r\n if self.config.overlay not in (None, APPLY_INSTALLED_OVERLAYS):\r\n+ print(self.config.overlay)\r\n breakpoint()\r\n raise NotImplementedError(\r\n \"Application of overlays within SystemContext class entry not yet supported\"\r\ndiff --git a/dffml/high_level/dataflow.py b/dffml/high_level/dataflow.py\r\nindex d180b5c302..d595ae1cb4 100644\r\n--- a/dffml/high_level/dataflow.py\r\n+++ b/dffml/high_level/dataflow.py\r\n@@ -206,12 +206,25 @@ async def run(\r\n # the of the one that got passed in and the overlay.\r\n if inspect.isclass(overlay):\r\n overlay = overlay()\r\n+ # TODO Move this into Overlay.load. 
Create a system context to\r\n+ # execute the overlay if it is not already.\r\n+ known_overlay_types = (DataFlow, SystemContext)\r\n+ if not isinstance(overlay, known_overlay_types):\r\n+ raise NotImplementedError(f\"{overlay} is not a known type {known_overlay_types}\")\r\n+ if isinstance(overlay, DataFlow):\r\n+ overlay = SystemContext(\r\n+ upstream=overlay,\r\n+ )\r\n # TODO(alice) overlay.deployment(\"native.python.overlay.apply\")\r\n apply_overlay = overlay.deployment()\r\n async for _ctx, result in apply_overlay(\r\n dataflow=dataflow,\r\n ):\r\n+ print(\"FEEDFACE\", _ctx, result)\r\n+ breakpoint()\r\n+ return\r\n continue\r\n+\r\n # TODO\r\n resultant_system_context = SystemContext(\r\n upstream=result[\"overlays_merged\"], overlay=None,\r\ndiff --git a/dffml/overlay/overlay.py b/dffml/overlay/overlay.py\r\nindex 13a50d9c10..0a01d38de9 100644\r\n--- a/dffml/overlay/overlay.py\r\n+++ b/dffml/overlay/overlay.py\r\n@@ -124,7 +124,7 @@ DFFML_MAIN_PACKAGE_OVERLAY = DataFlow(\r\n stage=Stage.OUTPUT,\r\n inputs={\r\n \"merged\": DataFlowAfterOverlaysMerged,\r\n- \"dataflow_we_are_applying_overlays_to_by_running_overlay_dataflow_and_passing_as_an_input\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n+ \"upstream\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n },\r\n outputs={\"overlayed\": DataFlowAfterOverlaysApplied,},\r\n multi_output=False,\r\n@@ -208,15 +208,12 @@ merge_implementations(\r\n DFFML_OVERLAYS_INSTALLED.update(auto_flow=True)\r\n \r\n # Create Class for calling operations within the System Context as methods\r\n-DFFMLOverlaysInstalled = SystemContext.subclass(\r\n- \"DFFMLOverlaysInstalled\",\r\n- {\r\n- \"upstream\": {\"default_factory\": lambda: DFFML_OVERLAYS_INSTALLED},\r\n- # TODO(alice) We'll need to make sure we have code to instantiate and\r\n- # instance of a class if only a class is given an not an instance.\r\n- \"overlay\": {\"default_factory\": lambda: None},\r\n- \"orchestrator\": {\"default_factory\": lambda: MemoryOrchestrator()},\r\n- },\r\n+DFFMLOverlaysInstalled = SystemContext(\r\n+ upstream=DFFML_OVERLAYS_INSTALLED,\r\n+ # TODO(alice) We'll need to make sure we have code to instantiate and\r\n+ # instance of a class if only a class is given an not an instance.\r\n+ overlay=None,\r\n+ orchestrator=MemoryOrchestrator(),\r\n )\r\n \r\n # Callee\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\nindex 46d20c8c85..fff5d4928b 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n@@ -18,6 +18,14 @@ from ....recommended_community_standards import AliceGitRepo, AlicePleaseContrib\r\n from ....dffml.operations.git.contribute import AlicePleaseContributeRecommendedCommunityStandardsOverlayGit\r\n \r\n \r\n+GitHubIssue = NewType(\"GitHubIssue\", str)\r\n+\r\n+\r\n+@dataclasses.dataclass\r\n+class RecommendedCommunityStandardContribution:\r\n+ path: pathlib.Path\r\n+ issue: GitHubIssue\r\n+\r\n \r\n class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n@@ -39,6 +47,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n MetaIssueTitle = NewType(\"MetaIssueTitle\", str)\r\n 
MetaIssueBody = NewType(\"MetaIssueBody\", str)\r\n \r\n+ # TODO This should only be run if there is a need for a README\r\n # body: Optional['ContributingIssueBody'] = \"References:\\n- https://docs.github.com/articles/setting-guidelines-for-repository-contributors/\",\r\n async def readme_issue(\r\n self,\r\n@@ -79,13 +88,40 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n ).lstrip()\r\n \r\n- # TODO(alice) There is a bug with Optional which can be revield by use here\r\n+\r\n @staticmethod\r\n+ async def readme_contribution(\r\n+ issue: \"ReadmeIssue\",\r\n+ path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n+ ) -> RecommendedCommunityStandardContribution:\r\n+ return RecommendedCommunityStandardContribution(\r\n+ path=path,\r\n+ issue=issue,\r\n+ )\r\n+\r\n+\r\n+ \"\"\"\r\n+ @dffml.op(\r\n+ stage=dffml.Stage.OUTPUT,\r\n+ )\r\n+ async def collect_recommended_community_standard_contributions(\r\n+ self,\r\n+ ) -> List[RecommendedCommunityStandardContribution]:\r\n+ async with self.octx.ictx.definitions(self.ctx) as od:\r\n+ return [item async for item in od.inputs(RecommendedCommunityStandardContribution)]\r\n+ \"\"\"\r\n+\r\n+\r\n+ # TODO(alice) There is a bug with Optional which can be revield by use here\r\n def meta_issue_body(\r\n repo: AliceGitRepo,\r\n base: AlicePleaseContributeRecommendedCommunityStandardsOverlayGit.BaseBranch,\r\n- readme_path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n- readme_issue: ReadmeIssue,\r\n+ # recommended_community_standard_contributions: List[RecommendedCommunityStandardContribution],\r\n+ # TODO On @op inspect paramter if Collect is found on an input, wrap the\r\n+ # operation in a subflow and add a generic version of\r\n+ # collect_recommended_community_standard_contributions to the flow as an\r\n+ # autostart or triggered via auto start operation.\r\n+ # recommended_community_standard_contributions: Collect[List[RecommendedCommunityStandardContribution]],\r\n ) -> \"MetaIssueBody\":\r\n \"\"\"\r\n >>> AlicePleaseContributeRecommendedCommunityStandardsGitHubIssueOverlay.meta_issue_body(\r\n@@ -98,6 +134,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n - [] [License](https://github.com/intel/dffml/blob/main/LICENSE)\r\n - [] Security\r\n \"\"\"\r\n+ readme_issue, readme_path = recommended_community_standard_contributions[0]\r\n return \"\\n\".join(\r\n [\r\n \"- [\"\r\n```",
"editedAt": "2022-07-28T05:51:37Z"
},
{
"diff": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n### Refactoring and Thinking About Locking of Repos for Contributions\r\n\r\n- Metadata\r\n - Date: 2022-07-27 20:30 UTC -7\r\n- Saving this diff which was some work on dynamic application of overlay\r\n so as to support fixup of the OpImp for `meta_issue_body()`'s inputs.\r\n - We are going to table this for now for time reasons, but if someone\r\n wants to pick it up before @pdxjohnny is back in September, please\r\n give it a go (create an issue).\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n- https://free-images.com/display/de_alices_abenteuer_im_43.html\r\n - ![alice looking up](https://user-images.githubusercontent.com/5950433/181430120-31e99c0b-4f52-422f-b6c9-b69b2c693a0d.png)\r\n\r\n- TODO\r\n - [ ] Auto fork repo before push\r\n - [ ] Update origin to push to\r\n - [ ] Create PR\r\n\r\n**entities/alice/alice/timelines.py**\r\n\r\n```python\r\n\"\"\"\r\nHelpers for the timelines we support\r\n\"\"\"\r\n\r\n# Trinity Day 0\r\nALICE_DAY_0_GREGORIAN = datetime.datetime(2022, 4, 16)\r\n\r\ndef date_alice_from_gregorian(date: str) -> int:\r\n # TODO\r\n return ALICE_DAY_0_GREGORIAN\r\n```\r\n\r\n```diff\r\ndiff --git a/dffml/base.py b/dffml/base.py\r\nindex fea0ef7220..9d6cd886fa 100644\r\n--- a/dffml/base.py\r\n+++ b/dffml/base.py\r\n@@ -237,6 +237,7 @@ def convert_value(arg, value, *, dataclass=None):\r\n # before checking if the value is an instance of that\r\n # type. 
Since it doesn't make sense to check if the\r\n # value is an instance of something that's not a type.\r\n+ print(possible_type, value)\r\n if isinstance(possible_type, type) and isinstance(\r\n value, possible_type\r\n ):\r\ndiff --git a/dffml/df/system_context/system_context.py b/dffml/df/system_context/system_context.py\r\nindex e055a343f1..063547ad0c 100644\r\n--- a/dffml/df/system_context/system_context.py\r\n+++ b/dffml/df/system_context/system_context.py\r\n@@ -90,11 +90,11 @@ class SystemContextConfig:\r\n # links: 'SystemContextConfig'\r\n overlay: Union[\"SystemContextConfig\", DataFlow] = field(\r\n \"The overlay we will apply with any overlays to merge within it (see default overlay usage docs)\",\r\n- default=APPLY_INSTALLED_OVERLAYS,\r\n+ default=None,\r\n )\r\n orchestrator: Union[\"SystemContextConfig\", BaseOrchestrator] = field(\r\n \"The system context who's default flow will be used to produce an orchestrator which will be used to execute this system context including application of overlays\",\r\n- default_factory=lambda: MemoryOrchestrator,\r\n+ default=None,\r\n )\r\n \r\n \r\n@@ -131,6 +131,7 @@ class SystemContext(BaseDataFlowFacilitatorObject):\r\n )\r\n # TODO(alice) Apply overlay\r\n if self.config.overlay not in (None, APPLY_INSTALLED_OVERLAYS):\r\n+ print(self.config.overlay)\r\n breakpoint()\r\n raise NotImplementedError(\r\n \"Application of overlays within SystemContext class entry not yet supported\"\r\ndiff --git a/dffml/high_level/dataflow.py b/dffml/high_level/dataflow.py\r\nindex d180b5c302..d595ae1cb4 100644\r\n--- a/dffml/high_level/dataflow.py\r\n+++ b/dffml/high_level/dataflow.py\r\n@@ -206,12 +206,25 @@ async def run(\r\n # the of the one that got passed in and the overlay.\r\n if inspect.isclass(overlay):\r\n overlay = overlay()\r\n+ # TODO Move this into Overlay.load. 
Create a system context to\r\n+ # execute the overlay if it is not already.\r\n+ known_overlay_types = (DataFlow, SystemContext)\r\n+ if not isinstance(overlay, known_overlay_types):\r\n+ raise NotImplementedError(f\"{overlay} is not a known type {known_overlay_types}\")\r\n+ if isinstance(overlay, DataFlow):\r\n+ overlay = SystemContext(\r\n+ upstream=overlay,\r\n+ )\r\n # TODO(alice) overlay.deployment(\"native.python.overlay.apply\")\r\n apply_overlay = overlay.deployment()\r\n async for _ctx, result in apply_overlay(\r\n dataflow=dataflow,\r\n ):\r\n+ print(\"FEEDFACE\", _ctx, result)\r\n+ breakpoint()\r\n+ return\r\n continue\r\n+\r\n # TODO\r\n resultant_system_context = SystemContext(\r\n upstream=result[\"overlays_merged\"], overlay=None,\r\ndiff --git a/dffml/overlay/overlay.py b/dffml/overlay/overlay.py\r\nindex 13a50d9c10..0a01d38de9 100644\r\n--- a/dffml/overlay/overlay.py\r\n+++ b/dffml/overlay/overlay.py\r\n@@ -124,7 +124,7 @@ DFFML_MAIN_PACKAGE_OVERLAY = DataFlow(\r\n stage=Stage.OUTPUT,\r\n inputs={\r\n \"merged\": DataFlowAfterOverlaysMerged,\r\n- \"dataflow_we_are_applying_overlays_to_by_running_overlay_dataflow_and_passing_as_an_input\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n+ \"upstream\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n },\r\n outputs={\"overlayed\": DataFlowAfterOverlaysApplied,},\r\n multi_output=False,\r\n@@ -208,15 +208,12 @@ merge_implementations(\r\n DFFML_OVERLAYS_INSTALLED.update(auto_flow=True)\r\n \r\n # Create Class for calling operations within the System Context as methods\r\n-DFFMLOverlaysInstalled = SystemContext.subclass(\r\n- \"DFFMLOverlaysInstalled\",\r\n- {\r\n- \"upstream\": {\"default_factory\": lambda: DFFML_OVERLAYS_INSTALLED},\r\n- # TODO(alice) We'll need to make sure we have code to instantiate and\r\n- # instance of a class if only a class is given an not an instance.\r\n- \"overlay\": {\"default_factory\": lambda: None},\r\n- \"orchestrator\": {\"default_factory\": lambda: MemoryOrchestrator()},\r\n- },\r\n+DFFMLOverlaysInstalled = SystemContext(\r\n+ upstream=DFFML_OVERLAYS_INSTALLED,\r\n+ # TODO(alice) We'll need to make sure we have code to instantiate and\r\n+ # instance of a class if only a class is given an not an instance.\r\n+ overlay=None,\r\n+ orchestrator=MemoryOrchestrator(),\r\n )\r\n \r\n # Callee\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\nindex 46d20c8c85..fff5d4928b 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n@@ -18,6 +18,14 @@ from ....recommended_community_standards import AliceGitRepo, AlicePleaseContrib\r\n from ....dffml.operations.git.contribute import AlicePleaseContributeRecommendedCommunityStandardsOverlayGit\r\n \r\n \r\n+GitHubIssue = NewType(\"GitHubIssue\", str)\r\n+\r\n+\r\n+@dataclasses.dataclass\r\n+class RecommendedCommunityStandardContribution:\r\n+ path: pathlib.Path\r\n+ issue: GitHubIssue\r\n+\r\n \r\n class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n@@ -39,6 +47,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n MetaIssueTitle = NewType(\"MetaIssueTitle\", str)\r\n 
MetaIssueBody = NewType(\"MetaIssueBody\", str)\r\n \r\n+ # TODO This should only be run if there is a need for a README\r\n # body: Optional['ContributingIssueBody'] = \"References:\\n- https://docs.github.com/articles/setting-guidelines-for-repository-contributors/\",\r\n async def readme_issue(\r\n self,\r\n@@ -79,13 +88,40 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n ).lstrip()\r\n \r\n- # TODO(alice) There is a bug with Optional which can be revield by use here\r\n+\r\n @staticmethod\r\n+ async def readme_contribution(\r\n+ issue: \"ReadmeIssue\",\r\n+ path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n+ ) -> RecommendedCommunityStandardContribution:\r\n+ return RecommendedCommunityStandardContribution(\r\n+ path=path,\r\n+ issue=issue,\r\n+ )\r\n+\r\n+\r\n+ \"\"\"\r\n+ @dffml.op(\r\n+ stage=dffml.Stage.OUTPUT,\r\n+ )\r\n+ async def collect_recommended_community_standard_contributions(\r\n+ self,\r\n+ ) -> List[RecommendedCommunityStandardContribution]:\r\n+ async with self.octx.ictx.definitions(self.ctx) as od:\r\n+ return [item async for item in od.inputs(RecommendedCommunityStandardContribution)]\r\n+ \"\"\"\r\n+\r\n+\r\n+ # TODO(alice) There is a bug with Optional which can be revield by use here\r\n def meta_issue_body(\r\n repo: AliceGitRepo,\r\n base: AlicePleaseContributeRecommendedCommunityStandardsOverlayGit.BaseBranch,\r\n- readme_path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n- readme_issue: ReadmeIssue,\r\n+ # recommended_community_standard_contributions: List[RecommendedCommunityStandardContribution],\r\n+ # TODO On @op inspect paramter if Collect is found on an input, wrap the\r\n+ # operation in a subflow and add a generic version of\r\n+ # collect_recommended_community_standard_contributions to the flow as an\r\n+ # autostart or triggered via auto start operation.\r\n+ # recommended_community_standard_contributions: Collect[List[RecommendedCommunityStandardContribution]],\r\n ) -> \"MetaIssueBody\":\r\n \"\"\"\r\n >>> AlicePleaseContributeRecommendedCommunityStandardsGitHubIssueOverlay.meta_issue_body(\r\n@@ -98,6 +134,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n - [] [License](https://github.com/intel/dffml/blob/main/LICENSE)\r\n - [] Security\r\n \"\"\"\r\n+ readme_issue, readme_path = recommended_community_standard_contributions[0]\r\n return \"\\n\".join(\r\n [\r\n \"- [\"\r\n```",
"editedAt": "2022-07-28T05:49:05Z"
},
{
"diff": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n### Refactoring and Thinking About Locking of Repos for Contributions\r\n\r\n- Metadata\r\n - Date: 2022-07-27 20:30 UTC -7\r\n- Saving this diff which was some work on dynamic application of overlay\r\n so as to support fixup of the OpImp for `meta_issue_body()`'s inputs.\r\n - We are going to table this for now for time reasons, but if someone\r\n wants to pick it up before @pdxjohnny is back in September, please\r\n give it a go (create an issue).\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n- TODO\r\n - [ ] Auto fork repo before push\r\n - [ ] Update origin to push to\r\n - [ ] Create PR\r\n\r\n**entities/alice/alice/timelines.py**\r\n\r\n```python\r\n\"\"\"\r\nHelpers for the timelines we support\r\n\"\"\"\r\n\r\n# Trinity Day 0\r\nALICE_DAY_0_GREGORIAN = datetime.datetime(2022, 4, 16)\r\n\r\ndef date_alice_from_gregorian(date: str) -> int:\r\n # TODO\r\n return ALICE_DAY_0_GREGORIAN\r\n```\r\n\r\n```diff\r\ndiff --git a/dffml/base.py b/dffml/base.py\r\nindex fea0ef7220..9d6cd886fa 100644\r\n--- a/dffml/base.py\r\n+++ b/dffml/base.py\r\n@@ -237,6 +237,7 @@ def convert_value(arg, value, *, dataclass=None):\r\n # before checking if the value is an instance of that\r\n # type. 
Since it doesn't make sense to check if the\r\n # value is an instance of something that's not a type.\r\n+ print(possible_type, value)\r\n if isinstance(possible_type, type) and isinstance(\r\n value, possible_type\r\n ):\r\ndiff --git a/dffml/df/system_context/system_context.py b/dffml/df/system_context/system_context.py\r\nindex e055a343f1..063547ad0c 100644\r\n--- a/dffml/df/system_context/system_context.py\r\n+++ b/dffml/df/system_context/system_context.py\r\n@@ -90,11 +90,11 @@ class SystemContextConfig:\r\n # links: 'SystemContextConfig'\r\n overlay: Union[\"SystemContextConfig\", DataFlow] = field(\r\n \"The overlay we will apply with any overlays to merge within it (see default overlay usage docs)\",\r\n- default=APPLY_INSTALLED_OVERLAYS,\r\n+ default=None,\r\n )\r\n orchestrator: Union[\"SystemContextConfig\", BaseOrchestrator] = field(\r\n \"The system context who's default flow will be used to produce an orchestrator which will be used to execute this system context including application of overlays\",\r\n- default_factory=lambda: MemoryOrchestrator,\r\n+ default=None,\r\n )\r\n \r\n \r\n@@ -131,6 +131,7 @@ class SystemContext(BaseDataFlowFacilitatorObject):\r\n )\r\n # TODO(alice) Apply overlay\r\n if self.config.overlay not in (None, APPLY_INSTALLED_OVERLAYS):\r\n+ print(self.config.overlay)\r\n breakpoint()\r\n raise NotImplementedError(\r\n \"Application of overlays within SystemContext class entry not yet supported\"\r\ndiff --git a/dffml/high_level/dataflow.py b/dffml/high_level/dataflow.py\r\nindex d180b5c302..d595ae1cb4 100644\r\n--- a/dffml/high_level/dataflow.py\r\n+++ b/dffml/high_level/dataflow.py\r\n@@ -206,12 +206,25 @@ async def run(\r\n # the of the one that got passed in and the overlay.\r\n if inspect.isclass(overlay):\r\n overlay = overlay()\r\n+ # TODO Move this into Overlay.load. 
Create a system context to\r\n+ # execute the overlay if it is not already.\r\n+ known_overlay_types = (DataFlow, SystemContext)\r\n+ if not isinstance(overlay, known_overlay_types):\r\n+ raise NotImplementedError(f\"{overlay} is not a known type {known_overlay_types}\")\r\n+ if isinstance(overlay, DataFlow):\r\n+ overlay = SystemContext(\r\n+ upstream=overlay,\r\n+ )\r\n # TODO(alice) overlay.deployment(\"native.python.overlay.apply\")\r\n apply_overlay = overlay.deployment()\r\n async for _ctx, result in apply_overlay(\r\n dataflow=dataflow,\r\n ):\r\n+ print(\"FEEDFACE\", _ctx, result)\r\n+ breakpoint()\r\n+ return\r\n continue\r\n+\r\n # TODO\r\n resultant_system_context = SystemContext(\r\n upstream=result[\"overlays_merged\"], overlay=None,\r\ndiff --git a/dffml/overlay/overlay.py b/dffml/overlay/overlay.py\r\nindex 13a50d9c10..0a01d38de9 100644\r\n--- a/dffml/overlay/overlay.py\r\n+++ b/dffml/overlay/overlay.py\r\n@@ -124,7 +124,7 @@ DFFML_MAIN_PACKAGE_OVERLAY = DataFlow(\r\n stage=Stage.OUTPUT,\r\n inputs={\r\n \"merged\": DataFlowAfterOverlaysMerged,\r\n- \"dataflow_we_are_applying_overlays_to_by_running_overlay_dataflow_and_passing_as_an_input\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n+ \"upstream\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n },\r\n outputs={\"overlayed\": DataFlowAfterOverlaysApplied,},\r\n multi_output=False,\r\n@@ -208,15 +208,12 @@ merge_implementations(\r\n DFFML_OVERLAYS_INSTALLED.update(auto_flow=True)\r\n \r\n # Create Class for calling operations within the System Context as methods\r\n-DFFMLOverlaysInstalled = SystemContext.subclass(\r\n- \"DFFMLOverlaysInstalled\",\r\n- {\r\n- \"upstream\": {\"default_factory\": lambda: DFFML_OVERLAYS_INSTALLED},\r\n- # TODO(alice) We'll need to make sure we have code to instantiate and\r\n- # instance of a class if only a class is given an not an instance.\r\n- \"overlay\": {\"default_factory\": lambda: None},\r\n- \"orchestrator\": {\"default_factory\": lambda: MemoryOrchestrator()},\r\n- },\r\n+DFFMLOverlaysInstalled = SystemContext(\r\n+ upstream=DFFML_OVERLAYS_INSTALLED,\r\n+ # TODO(alice) We'll need to make sure we have code to instantiate and\r\n+ # instance of a class if only a class is given an not an instance.\r\n+ overlay=None,\r\n+ orchestrator=MemoryOrchestrator(),\r\n )\r\n \r\n # Callee\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\nindex 46d20c8c85..fff5d4928b 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n@@ -18,6 +18,14 @@ from ....recommended_community_standards import AliceGitRepo, AlicePleaseContrib\r\n from ....dffml.operations.git.contribute import AlicePleaseContributeRecommendedCommunityStandardsOverlayGit\r\n \r\n \r\n+GitHubIssue = NewType(\"GitHubIssue\", str)\r\n+\r\n+\r\n+@dataclasses.dataclass\r\n+class RecommendedCommunityStandardContribution:\r\n+ path: pathlib.Path\r\n+ issue: GitHubIssue\r\n+\r\n \r\n class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n@@ -39,6 +47,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n MetaIssueTitle = NewType(\"MetaIssueTitle\", str)\r\n 
MetaIssueBody = NewType(\"MetaIssueBody\", str)\r\n \r\n+ # TODO This should only be run if there is a need for a README\r\n # body: Optional['ContributingIssueBody'] = \"References:\\n- https://docs.github.com/articles/setting-guidelines-for-repository-contributors/\",\r\n async def readme_issue(\r\n self,\r\n@@ -79,13 +88,40 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n ).lstrip()\r\n \r\n- # TODO(alice) There is a bug with Optional which can be revield by use here\r\n+\r\n @staticmethod\r\n+ async def readme_contribution(\r\n+ issue: \"ReadmeIssue\",\r\n+ path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n+ ) -> RecommendedCommunityStandardContribution:\r\n+ return RecommendedCommunityStandardContribution(\r\n+ path=path,\r\n+ issue=issue,\r\n+ )\r\n+\r\n+\r\n+ \"\"\"\r\n+ @dffml.op(\r\n+ stage=dffml.Stage.OUTPUT,\r\n+ )\r\n+ async def collect_recommended_community_standard_contributions(\r\n+ self,\r\n+ ) -> List[RecommendedCommunityStandardContribution]:\r\n+ async with self.octx.ictx.definitions(self.ctx) as od:\r\n+ return [item async for item in od.inputs(RecommendedCommunityStandardContribution)]\r\n+ \"\"\"\r\n+\r\n+\r\n+ # TODO(alice) There is a bug with Optional which can be revield by use here\r\n def meta_issue_body(\r\n repo: AliceGitRepo,\r\n base: AlicePleaseContributeRecommendedCommunityStandardsOverlayGit.BaseBranch,\r\n- readme_path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n- readme_issue: ReadmeIssue,\r\n+ # recommended_community_standard_contributions: List[RecommendedCommunityStandardContribution],\r\n+ # TODO On @op inspect paramter if Collect is found on an input, wrap the\r\n+ # operation in a subflow and add a generic version of\r\n+ # collect_recommended_community_standard_contributions to the flow as an\r\n+ # autostart or triggered via auto start operation.\r\n+ # recommended_community_standard_contributions: Collect[List[RecommendedCommunityStandardContribution]],\r\n ) -> \"MetaIssueBody\":\r\n \"\"\"\r\n >>> AlicePleaseContributeRecommendedCommunityStandardsGitHubIssueOverlay.meta_issue_body(\r\n@@ -98,6 +134,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n - [] [License](https://github.com/intel/dffml/blob/main/LICENSE)\r\n - [] Security\r\n \"\"\"\r\n+ readme_issue, readme_path = recommended_community_standard_contributions[0]\r\n return \"\\n\".join(\r\n [\r\n \"- [\"\r\n```",
"editedAt": "2022-07-28T04:47:01Z"
},
{
"diff": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n### Refactoring and Thinking About Locking of Repos for Contributions\r\n\r\n- Metadata\r\n - Date: 2022-07-27 20:30 UTC -7\r\n- Saving this diff which was some work on dynamic application of overlay\r\n so as to support fixup of the OpImp for `meta_issue_body()`'s inputs.\r\n - We are going to table this for now for time reasons, but if someone\r\n wants to pick it up before @pdxjohnny is back in September, please\r\n give it a go (create an issue).\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n- TODO\r\n - [ ] Auto fork repo before push\r\n - [ ] Update origin to push to\r\n - [ ] Create PR\r\n\r\n```diff\r\ndiff --git a/dffml/base.py b/dffml/base.py\r\nindex fea0ef7220..9d6cd886fa 100644\r\n--- a/dffml/base.py\r\n+++ b/dffml/base.py\r\n@@ -237,6 +237,7 @@ def convert_value(arg, value, *, dataclass=None):\r\n # before checking if the value is an instance of that\r\n # type. 
Since it doesn't make sense to check if the\r\n # value is an instance of something that's not a type.\r\n+ print(possible_type, value)\r\n if isinstance(possible_type, type) and isinstance(\r\n value, possible_type\r\n ):\r\ndiff --git a/dffml/df/system_context/system_context.py b/dffml/df/system_context/system_context.py\r\nindex e055a343f1..063547ad0c 100644\r\n--- a/dffml/df/system_context/system_context.py\r\n+++ b/dffml/df/system_context/system_context.py\r\n@@ -90,11 +90,11 @@ class SystemContextConfig:\r\n # links: 'SystemContextConfig'\r\n overlay: Union[\"SystemContextConfig\", DataFlow] = field(\r\n \"The overlay we will apply with any overlays to merge within it (see default overlay usage docs)\",\r\n- default=APPLY_INSTALLED_OVERLAYS,\r\n+ default=None,\r\n )\r\n orchestrator: Union[\"SystemContextConfig\", BaseOrchestrator] = field(\r\n \"The system context who's default flow will be used to produce an orchestrator which will be used to execute this system context including application of overlays\",\r\n- default_factory=lambda: MemoryOrchestrator,\r\n+ default=None,\r\n )\r\n \r\n \r\n@@ -131,6 +131,7 @@ class SystemContext(BaseDataFlowFacilitatorObject):\r\n )\r\n # TODO(alice) Apply overlay\r\n if self.config.overlay not in (None, APPLY_INSTALLED_OVERLAYS):\r\n+ print(self.config.overlay)\r\n breakpoint()\r\n raise NotImplementedError(\r\n \"Application of overlays within SystemContext class entry not yet supported\"\r\ndiff --git a/dffml/high_level/dataflow.py b/dffml/high_level/dataflow.py\r\nindex d180b5c302..d595ae1cb4 100644\r\n--- a/dffml/high_level/dataflow.py\r\n+++ b/dffml/high_level/dataflow.py\r\n@@ -206,12 +206,25 @@ async def run(\r\n # the of the one that got passed in and the overlay.\r\n if inspect.isclass(overlay):\r\n overlay = overlay()\r\n+ # TODO Move this into Overlay.load. 
Create a system context to\r\n+ # execute the overlay if it is not already.\r\n+ known_overlay_types = (DataFlow, SystemContext)\r\n+ if not isinstance(overlay, known_overlay_types):\r\n+ raise NotImplementedError(f\"{overlay} is not a known type {known_overlay_types}\")\r\n+ if isinstance(overlay, DataFlow):\r\n+ overlay = SystemContext(\r\n+ upstream=overlay,\r\n+ )\r\n # TODO(alice) overlay.deployment(\"native.python.overlay.apply\")\r\n apply_overlay = overlay.deployment()\r\n async for _ctx, result in apply_overlay(\r\n dataflow=dataflow,\r\n ):\r\n+ print(\"FEEDFACE\", _ctx, result)\r\n+ breakpoint()\r\n+ return\r\n continue\r\n+\r\n # TODO\r\n resultant_system_context = SystemContext(\r\n upstream=result[\"overlays_merged\"], overlay=None,\r\ndiff --git a/dffml/overlay/overlay.py b/dffml/overlay/overlay.py\r\nindex 13a50d9c10..0a01d38de9 100644\r\n--- a/dffml/overlay/overlay.py\r\n+++ b/dffml/overlay/overlay.py\r\n@@ -124,7 +124,7 @@ DFFML_MAIN_PACKAGE_OVERLAY = DataFlow(\r\n stage=Stage.OUTPUT,\r\n inputs={\r\n \"merged\": DataFlowAfterOverlaysMerged,\r\n- \"dataflow_we_are_applying_overlays_to_by_running_overlay_dataflow_and_passing_as_an_input\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n+ \"upstream\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n },\r\n outputs={\"overlayed\": DataFlowAfterOverlaysApplied,},\r\n multi_output=False,\r\n@@ -208,15 +208,12 @@ merge_implementations(\r\n DFFML_OVERLAYS_INSTALLED.update(auto_flow=True)\r\n \r\n # Create Class for calling operations within the System Context as methods\r\n-DFFMLOverlaysInstalled = SystemContext.subclass(\r\n- \"DFFMLOverlaysInstalled\",\r\n- {\r\n- \"upstream\": {\"default_factory\": lambda: DFFML_OVERLAYS_INSTALLED},\r\n- # TODO(alice) We'll need to make sure we have code to instantiate and\r\n- # instance of a class if only a class is given an not an instance.\r\n- \"overlay\": {\"default_factory\": lambda: None},\r\n- \"orchestrator\": {\"default_factory\": lambda: MemoryOrchestrator()},\r\n- },\r\n+DFFMLOverlaysInstalled = SystemContext(\r\n+ upstream=DFFML_OVERLAYS_INSTALLED,\r\n+ # TODO(alice) We'll need to make sure we have code to instantiate and\r\n+ # instance of a class if only a class is given an not an instance.\r\n+ overlay=None,\r\n+ orchestrator=MemoryOrchestrator(),\r\n )\r\n \r\n # Callee\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\nindex 46d20c8c85..fff5d4928b 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n@@ -18,6 +18,14 @@ from ....recommended_community_standards import AliceGitRepo, AlicePleaseContrib\r\n from ....dffml.operations.git.contribute import AlicePleaseContributeRecommendedCommunityStandardsOverlayGit\r\n \r\n \r\n+GitHubIssue = NewType(\"GitHubIssue\", str)\r\n+\r\n+\r\n+@dataclasses.dataclass\r\n+class RecommendedCommunityStandardContribution:\r\n+ path: pathlib.Path\r\n+ issue: GitHubIssue\r\n+\r\n \r\n class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n@@ -39,6 +47,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n MetaIssueTitle = NewType(\"MetaIssueTitle\", str)\r\n 
MetaIssueBody = NewType(\"MetaIssueBody\", str)\r\n \r\n+ # TODO This should only be run if there is a need for a README\r\n # body: Optional['ContributingIssueBody'] = \"References:\\n- https://docs.github.com/articles/setting-guidelines-for-repository-contributors/\",\r\n async def readme_issue(\r\n self,\r\n@@ -79,13 +88,40 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n ).lstrip()\r\n \r\n- # TODO(alice) There is a bug with Optional which can be revield by use here\r\n+\r\n @staticmethod\r\n+ async def readme_contribution(\r\n+ issue: \"ReadmeIssue\",\r\n+ path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n+ ) -> RecommendedCommunityStandardContribution:\r\n+ return RecommendedCommunityStandardContribution(\r\n+ path=path,\r\n+ issue=issue,\r\n+ )\r\n+\r\n+\r\n+ \"\"\"\r\n+ @dffml.op(\r\n+ stage=dffml.Stage.OUTPUT,\r\n+ )\r\n+ async def collect_recommended_community_standard_contributions(\r\n+ self,\r\n+ ) -> List[RecommendedCommunityStandardContribution]:\r\n+ async with self.octx.ictx.definitions(self.ctx) as od:\r\n+ return [item async for item in od.inputs(RecommendedCommunityStandardContribution)]\r\n+ \"\"\"\r\n+\r\n+\r\n+ # TODO(alice) There is a bug with Optional which can be revield by use here\r\n def meta_issue_body(\r\n repo: AliceGitRepo,\r\n base: AlicePleaseContributeRecommendedCommunityStandardsOverlayGit.BaseBranch,\r\n- readme_path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n- readme_issue: ReadmeIssue,\r\n+ # recommended_community_standard_contributions: List[RecommendedCommunityStandardContribution],\r\n+ # TODO On @op inspect paramter if Collect is found on an input, wrap the\r\n+ # operation in a subflow and add a generic version of\r\n+ # collect_recommended_community_standard_contributions to the flow as an\r\n+ # autostart or triggered via auto start operation.\r\n+ # recommended_community_standard_contributions: Collect[List[RecommendedCommunityStandardContribution]],\r\n ) -> \"MetaIssueBody\":\r\n \"\"\"\r\n >>> AlicePleaseContributeRecommendedCommunityStandardsGitHubIssueOverlay.meta_issue_body(\r\n@@ -98,6 +134,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n - [] [License](https://github.com/intel/dffml/blob/main/LICENSE)\r\n - [] Security\r\n \"\"\"\r\n+ readme_issue, readme_path = recommended_community_standard_contributions[0]\r\n return \"\\n\".join(\r\n [\r\n \"- [\"\r\n```",
"editedAt": "2022-07-28T03:42:48Z"
},
{
"diff": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n### Refactoring and Thinking About Locking of Repos for Contributions\r\n\r\n- Metadata\r\n - Date: 2022-07-27 20:30 UTC -7\r\n- Saving this diff which was some work on dynamic application of overlay\r\n so as to support fixup of the OpImp for `meta_issue_body()`'s inputs.\r\n - We are going to table this for now for time reasons, but if someone\r\n wants to pick it up before @pdxjohnny is back in September, please\r\n give it a go (create an issue).\r\n\r\n```diff\r\ndiff --git a/dffml/base.py b/dffml/base.py\r\nindex fea0ef7220..9d6cd886fa 100644\r\n--- a/dffml/base.py\r\n+++ b/dffml/base.py\r\n@@ -237,6 +237,7 @@ def convert_value(arg, value, *, dataclass=None):\r\n # before checking if the value is an instance of that\r\n # type. 
Since it doesn't make sense to check if the\r\n # value is an instance of something that's not a type.\r\n+ print(possible_type, value)\r\n if isinstance(possible_type, type) and isinstance(\r\n value, possible_type\r\n ):\r\ndiff --git a/dffml/df/system_context/system_context.py b/dffml/df/system_context/system_context.py\r\nindex e055a343f1..063547ad0c 100644\r\n--- a/dffml/df/system_context/system_context.py\r\n+++ b/dffml/df/system_context/system_context.py\r\n@@ -90,11 +90,11 @@ class SystemContextConfig:\r\n # links: 'SystemContextConfig'\r\n overlay: Union[\"SystemContextConfig\", DataFlow] = field(\r\n \"The overlay we will apply with any overlays to merge within it (see default overlay usage docs)\",\r\n- default=APPLY_INSTALLED_OVERLAYS,\r\n+ default=None,\r\n )\r\n orchestrator: Union[\"SystemContextConfig\", BaseOrchestrator] = field(\r\n \"The system context who's default flow will be used to produce an orchestrator which will be used to execute this system context including application of overlays\",\r\n- default_factory=lambda: MemoryOrchestrator,\r\n+ default=None,\r\n )\r\n \r\n \r\n@@ -131,6 +131,7 @@ class SystemContext(BaseDataFlowFacilitatorObject):\r\n )\r\n # TODO(alice) Apply overlay\r\n if self.config.overlay not in (None, APPLY_INSTALLED_OVERLAYS):\r\n+ print(self.config.overlay)\r\n breakpoint()\r\n raise NotImplementedError(\r\n \"Application of overlays within SystemContext class entry not yet supported\"\r\ndiff --git a/dffml/high_level/dataflow.py b/dffml/high_level/dataflow.py\r\nindex d180b5c302..d595ae1cb4 100644\r\n--- a/dffml/high_level/dataflow.py\r\n+++ b/dffml/high_level/dataflow.py\r\n@@ -206,12 +206,25 @@ async def run(\r\n # the of the one that got passed in and the overlay.\r\n if inspect.isclass(overlay):\r\n overlay = overlay()\r\n+ # TODO Move this into Overlay.load. 
Create a system context to\r\n+ # execute the overlay if it is not already.\r\n+ known_overlay_types = (DataFlow, SystemContext)\r\n+ if not isinstance(overlay, known_overlay_types):\r\n+ raise NotImplementedError(f\"{overlay} is not a known type {known_overlay_types}\")\r\n+ if isinstance(overlay, DataFlow):\r\n+ overlay = SystemContext(\r\n+ upstream=overlay,\r\n+ )\r\n # TODO(alice) overlay.deployment(\"native.python.overlay.apply\")\r\n apply_overlay = overlay.deployment()\r\n async for _ctx, result in apply_overlay(\r\n dataflow=dataflow,\r\n ):\r\n+ print(\"FEEDFACE\", _ctx, result)\r\n+ breakpoint()\r\n+ return\r\n continue\r\n+\r\n # TODO\r\n resultant_system_context = SystemContext(\r\n upstream=result[\"overlays_merged\"], overlay=None,\r\ndiff --git a/dffml/overlay/overlay.py b/dffml/overlay/overlay.py\r\nindex 13a50d9c10..0a01d38de9 100644\r\n--- a/dffml/overlay/overlay.py\r\n+++ b/dffml/overlay/overlay.py\r\n@@ -124,7 +124,7 @@ DFFML_MAIN_PACKAGE_OVERLAY = DataFlow(\r\n stage=Stage.OUTPUT,\r\n inputs={\r\n \"merged\": DataFlowAfterOverlaysMerged,\r\n- \"dataflow_we_are_applying_overlays_to_by_running_overlay_dataflow_and_passing_as_an_input\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n+ \"upstream\": DataFlowWeAreApplyingOverlaysToByRunningOverlayDataflowAndPassingAsAnInput,\r\n },\r\n outputs={\"overlayed\": DataFlowAfterOverlaysApplied,},\r\n multi_output=False,\r\n@@ -208,15 +208,12 @@ merge_implementations(\r\n DFFML_OVERLAYS_INSTALLED.update(auto_flow=True)\r\n \r\n # Create Class for calling operations within the System Context as methods\r\n-DFFMLOverlaysInstalled = SystemContext.subclass(\r\n- \"DFFMLOverlaysInstalled\",\r\n- {\r\n- \"upstream\": {\"default_factory\": lambda: DFFML_OVERLAYS_INSTALLED},\r\n- # TODO(alice) We'll need to make sure we have code to instantiate and\r\n- # instance of a class if only a class is given an not an instance.\r\n- \"overlay\": {\"default_factory\": lambda: None},\r\n- \"orchestrator\": {\"default_factory\": lambda: MemoryOrchestrator()},\r\n- },\r\n+DFFMLOverlaysInstalled = SystemContext(\r\n+ upstream=DFFML_OVERLAYS_INSTALLED,\r\n+ # TODO(alice) We'll need to make sure we have code to instantiate and\r\n+ # instance of a class if only a class is given an not an instance.\r\n+ overlay=None,\r\n+ orchestrator=MemoryOrchestrator(),\r\n )\r\n \r\n # Callee\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\nindex 46d20c8c85..fff5d4928b 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/alice/operations/github/issue.py\r\n@@ -18,6 +18,14 @@ from ....recommended_community_standards import AliceGitRepo, AlicePleaseContrib\r\n from ....dffml.operations.git.contribute import AlicePleaseContributeRecommendedCommunityStandardsOverlayGit\r\n \r\n \r\n+GitHubIssue = NewType(\"GitHubIssue\", str)\r\n+\r\n+\r\n+@dataclasses.dataclass\r\n+class RecommendedCommunityStandardContribution:\r\n+ path: pathlib.Path\r\n+ issue: GitHubIssue\r\n+\r\n \r\n class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n@@ -39,6 +47,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n MetaIssueTitle = NewType(\"MetaIssueTitle\", str)\r\n 
MetaIssueBody = NewType(\"MetaIssueBody\", str)\r\n \r\n+ # TODO This should only be run if there is a need for a README\r\n # body: Optional['ContributingIssueBody'] = \"References:\\n- https://docs.github.com/articles/setting-guidelines-for-repository-contributors/\",\r\n async def readme_issue(\r\n self,\r\n@@ -79,13 +88,40 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n \"\"\"\r\n ).lstrip()\r\n \r\n- # TODO(alice) There is a bug with Optional which can be revield by use here\r\n+\r\n @staticmethod\r\n+ async def readme_contribution(\r\n+ issue: \"ReadmeIssue\",\r\n+ path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n+ ) -> RecommendedCommunityStandardContribution:\r\n+ return RecommendedCommunityStandardContribution(\r\n+ path=path,\r\n+ issue=issue,\r\n+ )\r\n+\r\n+\r\n+ \"\"\"\r\n+ @dffml.op(\r\n+ stage=dffml.Stage.OUTPUT,\r\n+ )\r\n+ async def collect_recommended_community_standard_contributions(\r\n+ self,\r\n+ ) -> List[RecommendedCommunityStandardContribution]:\r\n+ async with self.octx.ictx.definitions(self.ctx) as od:\r\n+ return [item async for item in od.inputs(RecommendedCommunityStandardContribution)]\r\n+ \"\"\"\r\n+\r\n+\r\n+ # TODO(alice) There is a bug with Optional which can be revield by use here\r\n def meta_issue_body(\r\n repo: AliceGitRepo,\r\n base: AlicePleaseContributeRecommendedCommunityStandardsOverlayGit.BaseBranch,\r\n- readme_path: AlicePleaseContributeRecommendedCommunityStandards.ReadmePath,\r\n- readme_issue: ReadmeIssue,\r\n+ # recommended_community_standard_contributions: List[RecommendedCommunityStandardContribution],\r\n+ # TODO On @op inspect paramter if Collect is found on an input, wrap the\r\n+ # operation in a subflow and add a generic version of\r\n+ # collect_recommended_community_standard_contributions to the flow as an\r\n+ # autostart or triggered via auto start operation.\r\n+ # recommended_community_standard_contributions: Collect[List[RecommendedCommunityStandardContribution]],\r\n ) -> \"MetaIssueBody\":\r\n \"\"\"\r\n >>> AlicePleaseContributeRecommendedCommunityStandardsGitHubIssueOverlay.meta_issue_body(\r\n@@ -98,6 +134,7 @@ class AlicePleaseContributeRecommendedCommunityStandardsOverlayGitHubIssue:\r\n - [] [License](https://github.com/intel/dffml/blob/main/LICENSE)\r\n - [] Security\r\n \"\"\"\r\n+ readme_issue, readme_path = recommended_community_standard_contributions[0]\r\n return \"\\n\".join(\r\n [\r\n \"- [\"\r\n```",
"editedAt": "2022-07-28T03:39:15Z"
},
{
"diff": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n### Refactoring and Thinking About Locking of Repos for Contributions\r\n\r\n- Metadata\r\n - Date: 2022-07-27 20:30 UTC -7\r\n- ",
"editedAt": "2022-07-28T03:37:01Z"
}
]
}
}
]
}
},
{
"body": "# 2022-07-28 Engineering Logs",
"createdAt": "2022-07-28T10:28:59Z",
"userContentEdits": {
"pageInfo": {
"endCursor": null,
"hasNextPage": false
},
"totalCount": 0,
"nodes": []
},
"replies": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpK5MjAyMi0wNy0yOFQwMzoyOTo1NC0wNzowMM4AMd_d",
"hasNextPage": false
},
"totalCount": 1,
"nodes": [
{
"body": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n### 43\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition\r\n - Found out that it was seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayReadme to that it's definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - We found it created an infinite loop\r\n - Will try reusing redundancy checker, that seems to be doing well\r\n- https://github.com/intel/dffml/issues/1408\r\n- Now debugging why `readme_pr` not called, OverlayGit definitions were seen earlier\r\n on subflow start to be present, must be something else.\r\n - The logs tell us that alice_contribute_readme is returning `None`, which means\r\n that the downstream operation is not called, since None means no return value\r\n in this case.\r\n\r\n```\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n- Future\r\n - `run_custom` Optionally support forward subflow\r\n- TODO\r\n - [ ] Set definition proprety `AliceGitRepo.lock` to `True`\r\n\r\n\r\n### 44\r\n\r\n- Found out that util: subprocess: run command events: Do not return after yield of stdout/err\r\n - Fixed in b6eea6ed4549f9e7a89aab6306a51213b2bf36c9\r\n\r\n```console\r\n$ (for i in $(echo determin_base_branch readme_pr_body contribute_readme_md github_owns_remote alice_contribute_readme); do grep -rn \"${i} Outputs\" .output/2022-07-28-14-11.txt; done) | sort | uniq | sort\r\n354:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote Outputs: {'result': 
'origin'}\r\n361:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Outputs: {'result': 'master'}\r\n450:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body Outputs: {'result': 'Closes: https://github.com/pdxjohnny/testaaaa/issues/188'}\r\n472:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}\r\n479:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n```\r\n(Pdb)\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}\r\n(Pdb) pprint(readme_dataflow.flow['alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr'].inputs)\r\n{'base': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch': 'result'}],\r\n 'body': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body': 'result'}],\r\n 'head': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md': 'result'}],\r\n 'origin': ['seed'],\r\n 'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}],\r\n 'title': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_title': 'result'}]}\r\n```\r\n\r\n- origin is set to seed\r\n - `'origin': ['seed']` was there because `OverlayGitHub.github_owns_remote` is not in the flow\r\n - We forgot add it to `entry_points.txt`, added\r\n\r\n```console\r\n$ dffml service dev export alice.cli:AlicePleaseContributeCLIDataFlow | tee alice.please.contribute.recommended_community_standards.json\r\n$ (echo -e 'HTTP/1.0 200 OK\\n' && dffml dataflow diagram -shortname alice.please.contribute.recommended_community_standards.json) | nc -Nlp 9999;\r\n```\r\n\r\n- Opens\r\n - `guessed_repo_string_means_no_git_branch_given` is feeding `git_repo_default_branch` but `dffml dataflow diagram` just have a bug because it's not showing the connection.\r\n\r\n```mermaid\r\ngraph TD\r\nsubgraph a759a07029077edc5c37fea0326fa281[Processing Stage]\r\nstyle a759a07029077edc5c37fea0326fa281 fill:#afd388b5,stroke:#a4ca7a\r\nsubgraph 8cfb8cd5b8620de4a7ebe0dfec00771a[cli_has_repos]\r\nstyle 8cfb8cd5b8620de4a7ebe0dfec00771a fill:#fff4de,stroke:#cece71\r\nd493c90433d19f11f33c2d72cd144940[cli_has_repos]\r\ne07552ee3b6b7696cb3ddd786222eaad(cmd)\r\ne07552ee3b6b7696cb3ddd786222eaad --> d493c90433d19f11f33c2d72cd144940\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324(wanted)\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324 --> d493c90433d19f11f33c2d72cd144940\r\n79e1ea6822bff603a835fb8ee80c7ff3(result)\r\nd493c90433d19f11f33c2d72cd144940 --> 
79e1ea6822bff603a835fb8ee80c7ff3\r\nend\r\nsubgraph 0c2b64320fb5666a034794bb2195ecf0[cli_is_asking_for_recommended_community_standards]\r\nstyle 0c2b64320fb5666a034794bb2195ecf0 fill:#fff4de,stroke:#cece71\r\n222ee6c0209f1f1b7a782bc1276868c7[cli_is_asking_for_recommended_community_standards]\r\n330f463830aa97e88917d5a9d1c21500(cmd)\r\n330f463830aa97e88917d5a9d1c21500 --> 222ee6c0209f1f1b7a782bc1276868c7\r\nba29b52e9c5aa88ea1caeeff29bfd491(result)\r\n222ee6c0209f1f1b7a782bc1276868c7 --> ba29b52e9c5aa88ea1caeeff29bfd491\r\nend\r\nsubgraph eac58e8db2b55cb9cc5474aaa402c93e[cli_is_meant_on_this_repo]\r\nstyle eac58e8db2b55cb9cc5474aaa402c93e fill:#fff4de,stroke:#cece71\r\n6c819ad0228b0e7094b33e0634da9a38[cli_is_meant_on_this_repo]\r\ndc7c5f0836f7d2564c402bf956722672(cmd)\r\ndc7c5f0836f7d2564c402bf956722672 --> 6c819ad0228b0e7094b33e0634da9a38\r\n58d8518cb0d6ef6ad35dc242486f1beb(wanted)\r\n58d8518cb0d6ef6ad35dc242486f1beb --> 6c819ad0228b0e7094b33e0634da9a38\r\n135ee61e3402d6fcbd7a219b0b4ccd73(result)\r\n6c819ad0228b0e7094b33e0634da9a38 --> 135ee61e3402d6fcbd7a219b0b4ccd73\r\nend\r\nsubgraph 37887bf260c5c8e9bd18038401008bbc[cli_run_on_repo]\r\nstyle 37887bf260c5c8e9bd18038401008bbc fill:#fff4de,stroke:#cece71\r\n9d1042f33352800e54d98c9c5a4223df[cli_run_on_repo]\r\ne824ae072860bc545fc7d55aa0bca479(repo)\r\ne824ae072860bc545fc7d55aa0bca479 --> 9d1042f33352800e54d98c9c5a4223df\r\n40109d487bb9f08608d8c5f6e747042f(result)\r\n9d1042f33352800e54d98c9c5a4223df --> 40109d487bb9f08608d8c5f6e747042f\r\nend\r\nsubgraph 66ecd0c1f2e08941c443ec9cd89ec589[guess_repo_string_is_directory]\r\nstyle 66ecd0c1f2e08941c443ec9cd89ec589 fill:#fff4de,stroke:#cece71\r\n737d719a0c348ff65456024ddbc530fe[guess_repo_string_is_directory]\r\n33d806f9b732bfd6b96ae2e9e4243a68(repo_string)\r\n33d806f9b732bfd6b96ae2e9e4243a68 --> 737d719a0c348ff65456024ddbc530fe\r\ndd5aab190ce844673819298c5b8fde76(result)\r\n737d719a0c348ff65456024ddbc530fe --> dd5aab190ce844673819298c5b8fde76\r\nend\r\nsubgraph 4ea6696419c4a0862a4f63ea1f60c751[create_branch_if_none_exists]\r\nstyle 4ea6696419c4a0862a4f63ea1f60c751 fill:#fff4de,stroke:#cece71\r\n502369b37882b300d6620d5b4020f5b2[create_branch_if_none_exists]\r\nfdcb9b6113856222e30e093f7c38065e(name)\r\nfdcb9b6113856222e30e093f7c38065e --> 502369b37882b300d6620d5b4020f5b2\r\nbdcf4b078985f4a390e4ed4beacffa65(repo)\r\nbdcf4b078985f4a390e4ed4beacffa65 --> 502369b37882b300d6620d5b4020f5b2\r\n5a5493ab86ab4053f1d44302e7bdddd6(result)\r\n502369b37882b300d6620d5b4020f5b2 --> 5a5493ab86ab4053f1d44302e7bdddd6\r\nend\r\nsubgraph b1d510183f6a4c3fde207a4656c72cb4[determin_base_branch]\r\nstyle b1d510183f6a4c3fde207a4656c72cb4 fill:#fff4de,stroke:#cece71\r\n476aecd4d4d712cda1879feba46ea109[determin_base_branch]\r\nff47cf65b58262acec28507f4427de45(default_branch)\r\nff47cf65b58262acec28507f4427de45 --> 476aecd4d4d712cda1879feba46ea109\r\n150204cd2d5a921deb53c312418379a1(result)\r\n476aecd4d4d712cda1879feba46ea109 --> 150204cd2d5a921deb53c312418379a1\r\nend\r\nsubgraph 2a08ff341f159c170b7fe017eaad2f18[git_repo_to_alice_git_repo]\r\nstyle 2a08ff341f159c170b7fe017eaad2f18 fill:#fff4de,stroke:#cece71\r\n7f74112f6d30c6289caa0a000e87edab[git_repo_to_alice_git_repo]\r\ne58180baf478fe910359358a3fa02234(repo)\r\ne58180baf478fe910359358a3fa02234 --> 7f74112f6d30c6289caa0a000e87edab\r\n9b92d5a346885079a2821c4d27cb5174(result)\r\n7f74112f6d30c6289caa0a000e87edab --> 9b92d5a346885079a2821c4d27cb5174\r\nend\r\nsubgraph b5d35aa8a8dcd28d22d47caad02676b0[guess_repo_string_is_url]\r\nstyle b5d35aa8a8dcd28d22d47caad02676b0 
fill:#fff4de,stroke:#cece71\r\n0de074e71a32e30889b8bb400cf8db9f[guess_repo_string_is_url]\r\nc3bfe79b396a98ce2d9bfe772c9c20af(repo_string)\r\nc3bfe79b396a98ce2d9bfe772c9c20af --> 0de074e71a32e30889b8bb400cf8db9f\r\n2a1c620b0d510c3d8ed35deda41851c5(result)\r\n0de074e71a32e30889b8bb400cf8db9f --> 2a1c620b0d510c3d8ed35deda41851c5\r\nend\r\nsubgraph 60791520c6d124c0bf15e599132b0caf[guessed_repo_string_is_operations_git_url]\r\nstyle 60791520c6d124c0bf15e599132b0caf fill:#fff4de,stroke:#cece71\r\n102f173505d7b546236cdeff191369d4[guessed_repo_string_is_operations_git_url]\r\n4934c6211334318c63a5e91530171c9b(repo_url)\r\n4934c6211334318c63a5e91530171c9b --> 102f173505d7b546236cdeff191369d4\r\n8d0adc31da1a0919724baf73d047743c(result)\r\n102f173505d7b546236cdeff191369d4 --> 8d0adc31da1a0919724baf73d047743c\r\nend\r\nsubgraph f2c7b93622447999daab403713239ada[guessed_repo_string_means_no_git_branch_given]\r\nstyle f2c7b93622447999daab403713239ada fill:#fff4de,stroke:#cece71\r\nc8294a87e7aae8f7f9cb7f53e054fed5[guessed_repo_string_means_no_git_branch_given]\r\n5567dd8a6d7ae4fe86252db32e189a4d(repo_url)\r\n5567dd8a6d7ae4fe86252db32e189a4d --> c8294a87e7aae8f7f9cb7f53e054fed5\r\nd888e6b64b5e3496056088f14dab9894(result)\r\nc8294a87e7aae8f7f9cb7f53e054fed5 --> d888e6b64b5e3496056088f14dab9894\r\nend\r\nsubgraph 113addf4beee5305fdc79d2363608f9d[github_owns_remote]\r\nstyle 113addf4beee5305fdc79d2363608f9d fill:#fff4de,stroke:#cece71\r\n049b72b81b976fbb43607bfeeb0464c5[github_owns_remote]\r\n6c2b36393ffff6be0b4ad333df2d9419(remote)\r\n6c2b36393ffff6be0b4ad333df2d9419 --> 049b72b81b976fbb43607bfeeb0464c5\r\n19a9ee483c1743e6ecf0a2dc3b6f8c7a(repo)\r\n19a9ee483c1743e6ecf0a2dc3b6f8c7a --> 049b72b81b976fbb43607bfeeb0464c5\r\nb4cff8d194413f436d94f9d84ece0262(result)\r\n049b72b81b976fbb43607bfeeb0464c5 --> b4cff8d194413f436d94f9d84ece0262\r\nend\r\nsubgraph 43a22312a3d4f5c995c54c5196acc50a[create_meta_issue]\r\nstyle 43a22312a3d4f5c995c54c5196acc50a fill:#fff4de,stroke:#cece71\r\nd2345f23e5ef9f54c591c4a687c24575[create_meta_issue]\r\n1d79010ee1550f057c531130814c40b9(body)\r\n1d79010ee1550f057c531130814c40b9 --> d2345f23e5ef9f54c591c4a687c24575\r\n712d4318e59bd2dc629f0ddebb257ca3(repo)\r\n712d4318e59bd2dc629f0ddebb257ca3 --> d2345f23e5ef9f54c591c4a687c24575\r\n38a94f1c2162803f571489d707d61021(title)\r\n38a94f1c2162803f571489d707d61021 --> d2345f23e5ef9f54c591c4a687c24575\r\n2b22b4998ac3e6a64d82e0147e71ee1b(result)\r\nd2345f23e5ef9f54c591c4a687c24575 --> 2b22b4998ac3e6a64d82e0147e71ee1b\r\nend\r\nsubgraph f77af509c413b86b6cd7e107cc623c73[meta_issue_body]\r\nstyle f77af509c413b86b6cd7e107cc623c73 fill:#fff4de,stroke:#cece71\r\n69a9852570720a3d35cb9dd52a281f71[meta_issue_body]\r\n480d1cc478d23858e92d61225349b674(base)\r\n480d1cc478d23858e92d61225349b674 --> 69a9852570720a3d35cb9dd52a281f71\r\n37035ea5a06a282bdc1e1de24090a36d(readme_issue)\r\n37035ea5a06a282bdc1e1de24090a36d --> 69a9852570720a3d35cb9dd52a281f71\r\nfdf0dbb8ca47ee9022b3daeb8c7df9c0(readme_path)\r\nfdf0dbb8ca47ee9022b3daeb8c7df9c0 --> 69a9852570720a3d35cb9dd52a281f71\r\n428ca84f627c695362652cc7531fc27b(repo)\r\n428ca84f627c695362652cc7531fc27b --> 69a9852570720a3d35cb9dd52a281f71\r\n0cd9eb1ffb3c56d2b0a4359f800b1f20(result)\r\n69a9852570720a3d35cb9dd52a281f71 --> 0cd9eb1ffb3c56d2b0a4359f800b1f20\r\nend\r\nsubgraph 8506cba6514466fb6d65f33ace4b0eac[alice_contribute_readme]\r\nstyle 8506cba6514466fb6d65f33ace4b0eac 
fill:#fff4de,stroke:#cece71\r\nd4507d3d1c3fbf3e7e373eae24797667[alice_contribute_readme]\r\n68cf7d6869d027ca46a5fb4dbf7001d1(repo)\r\n68cf7d6869d027ca46a5fb4dbf7001d1 --> d4507d3d1c3fbf3e7e373eae24797667\r\n2f9316539862f119f7c525bf9061e974(result)\r\nd4507d3d1c3fbf3e7e373eae24797667 --> 2f9316539862f119f7c525bf9061e974\r\nend\r\nsubgraph 4233e6dc67cba131d4ef005af9c02959[contribute_readme_md]\r\nstyle 4233e6dc67cba131d4ef005af9c02959 fill:#fff4de,stroke:#cece71\r\n3db0ee5d6ab83886bded5afd86f3f88f[contribute_readme_md]\r\n37044e4d8610abe13849bc71a5cb7591(base)\r\n37044e4d8610abe13849bc71a5cb7591 --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n631c051fe6050ae8f8fc3321ed00802d(commit_message)\r\n631c051fe6050ae8f8fc3321ed00802d --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n182194bab776fc9bc406ed573d621b68(repo)\r\n182194bab776fc9bc406ed573d621b68 --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n0ee9f524d2db12be854fe611fa8126dd(result)\r\n3db0ee5d6ab83886bded5afd86f3f88f --> 0ee9f524d2db12be854fe611fa8126dd\r\nend\r\nsubgraph a6080d9c45eb5f806a47152a18bf7830[create_readme_file_if_not_exists]\r\nstyle a6080d9c45eb5f806a47152a18bf7830 fill:#fff4de,stroke:#cece71\r\n67e388f508dd96084c37d236a2c67e67[create_readme_file_if_not_exists]\r\n54faf20bfdca0e63d07efb3e5a984cf1(readme_contents)\r\n54faf20bfdca0e63d07efb3e5a984cf1 --> 67e388f508dd96084c37d236a2c67e67\r\n8c089c362960ccf181742334a3dccaea(repo)\r\n8c089c362960ccf181742334a3dccaea --> 67e388f508dd96084c37d236a2c67e67\r\n5cc65e17d40e6a7223c1504f1c4b0d2a(result)\r\n67e388f508dd96084c37d236a2c67e67 --> 5cc65e17d40e6a7223c1504f1c4b0d2a\r\nend\r\nsubgraph e7757158127e9845b2915c16a7fa80c5[readme_commit_message]\r\nstyle e7757158127e9845b2915c16a7fa80c5 fill:#fff4de,stroke:#cece71\r\n562bdc535c7cebfc66dba920b1a17540[readme_commit_message]\r\n0af5cbea9050874a0a3cba73bb61f892(issue_url)\r\n0af5cbea9050874a0a3cba73bb61f892 --> 562bdc535c7cebfc66dba920b1a17540\r\n2641f3b67327fb7518ee34a3a40b0755(result)\r\n562bdc535c7cebfc66dba920b1a17540 --> 2641f3b67327fb7518ee34a3a40b0755\r\nend\r\nsubgraph cf99ff6fad80e9c21266b43fd67b2f7b[readme_issue]\r\nstyle cf99ff6fad80e9c21266b43fd67b2f7b fill:#fff4de,stroke:#cece71\r\nda44417f891a945085590baafffc2bdb[readme_issue]\r\nd519830ab4e07ec391038e8581889ac3(body)\r\nd519830ab4e07ec391038e8581889ac3 --> da44417f891a945085590baafffc2bdb\r\n268852aa3fa8ab0864a32abae5a333f7(repo)\r\n268852aa3fa8ab0864a32abae5a333f7 --> da44417f891a945085590baafffc2bdb\r\n77a11dd29af309cf43ed321446c4bf01(title)\r\n77a11dd29af309cf43ed321446c4bf01 --> da44417f891a945085590baafffc2bdb\r\n1d2360c9da18fac0b6ec142df8f3fbda(result)\r\nda44417f891a945085590baafffc2bdb --> 1d2360c9da18fac0b6ec142df8f3fbda\r\nend\r\nsubgraph 7ec0442cf2d95c367912e8abee09b217[readme_pr]\r\nstyle 7ec0442cf2d95c367912e8abee09b217 fill:#fff4de,stroke:#cece71\r\nbb314dc452cde5b6af5ea94dd277ba40[readme_pr]\r\n127d77c3047facc1daa621148c5a0a1d(base)\r\n127d77c3047facc1daa621148c5a0a1d --> bb314dc452cde5b6af5ea94dd277ba40\r\ncb421e4de153cbb912f7fbe57e4ad734(body)\r\ncb421e4de153cbb912f7fbe57e4ad734 --> bb314dc452cde5b6af5ea94dd277ba40\r\ncbf7a0b88c0a41953b245303f3e9a0d3(head)\r\ncbf7a0b88c0a41953b245303f3e9a0d3 --> bb314dc452cde5b6af5ea94dd277ba40\r\ne5f9ad44448abd2469b3fd9831f3d159(origin)\r\ne5f9ad44448abd2469b3fd9831f3d159 --> bb314dc452cde5b6af5ea94dd277ba40\r\na35aee6711d240378eb57a3932537ca1(repo)\r\na35aee6711d240378eb57a3932537ca1 --> bb314dc452cde5b6af5ea94dd277ba40\r\ndfcce88a7d605d46bf17de1159fbe5ad(title)\r\ndfcce88a7d605d46bf17de1159fbe5ad --> 
bb314dc452cde5b6af5ea94dd277ba40\r\na210a7890a7bea8d629368e02da3d806(result)\r\nbb314dc452cde5b6af5ea94dd277ba40 --> a210a7890a7bea8d629368e02da3d806\r\nend\r\nsubgraph 227eabb1f1c5cc0bc931714a03049e27[readme_pr_body]\r\nstyle 227eabb1f1c5cc0bc931714a03049e27 fill:#fff4de,stroke:#cece71\r\n2aea976396cfe68dacd9bc7d4a3f0cba[readme_pr_body]\r\nc5dfd309617c909b852afe0b4ae4a178(readme_issue)\r\nc5dfd309617c909b852afe0b4ae4a178 --> 2aea976396cfe68dacd9bc7d4a3f0cba\r\n40ddb5b508cb5643e7c91f7abdb72b84(result)\r\n2aea976396cfe68dacd9bc7d4a3f0cba --> 40ddb5b508cb5643e7c91f7abdb72b84\r\nend\r\nsubgraph 48687c84e69b3db0acca625cbe2e6b49[readme_pr_title]\r\nstyle 48687c84e69b3db0acca625cbe2e6b49 fill:#fff4de,stroke:#cece71\r\nd8668ff93f41bc241c8c540199cd7453[readme_pr_title]\r\n3b2137dd1c61d0dac7d4e40fd6746cfb(readme_issue)\r\n3b2137dd1c61d0dac7d4e40fd6746cfb --> d8668ff93f41bc241c8c540199cd7453\r\n956e024fde513b3a449eac9ee42d6ab3(result)\r\nd8668ff93f41bc241c8c540199cd7453 --> 956e024fde513b3a449eac9ee42d6ab3\r\nend\r\nsubgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL]\r\nstyle d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71\r\nf577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL]\r\n7440e73a8e8f864097f42162b74f2762(URL)\r\n7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40\r\n8e39b501b41c5d0e4596318f80a03210(valid)\r\nf577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210\r\nend\r\nsubgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo]\r\nstyle af8da22d1318d911f29b95e687f87c5d fill:#fff4de,stroke:#cece71\r\n155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo]\r\need77b9eea541e0c378c67395351099c(URL)\r\need77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n8b5928cd265dd2c44d67d076f60c8b05(ssh_key)\r\n8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n4e1d5ea96e050e46ebf95ebc0713d54c(repo)\r\n155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c\r\n6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL}\r\n6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\nend\r\nsubgraph d3d91578caf34c0ae944b17853783406[git_repo_default_branch]\r\nstyle d3d91578caf34c0ae944b17853783406 fill:#fff4de,stroke:#cece71\r\n546062a96122df465d2631f31df4e9e3[git_repo_default_branch]\r\n181f1b33df4d795fbad2911ec7087e86(repo)\r\n181f1b33df4d795fbad2911ec7087e86 --> 546062a96122df465d2631f31df4e9e3\r\n57651c1bcd24b794dfc8d1794ab556d5(branch)\r\n546062a96122df465d2631f31df4e9e3 --> 57651c1bcd24b794dfc8d1794ab556d5\r\n5ed1ab77e726d7efdcc41e9e2f8039c6(remote)\r\n546062a96122df465d2631f31df4e9e3 --> 5ed1ab77e726d7efdcc41e9e2f8039c6\r\n4c3cdd5f15b7a846d291aac089e8a622{no_git_branch_given}\r\n4c3cdd5f15b7a846d291aac089e8a622 --> 546062a96122df465d2631f31df4e9e3\r\nend\r\nend\r\nsubgraph a4827add25f5c7d5895c5728b74e2beb[Cleanup Stage]\r\nstyle a4827add25f5c7d5895c5728b74e2beb fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph 58ca4d24d2767176f196436c2890b926[Output Stage]\r\nstyle 58ca4d24d2767176f196436c2890b926 fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph inputs[Inputs]\r\nstyle inputs fill:#f6dbf9,stroke:#a178ca\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> e07552ee3b6b7696cb3ddd786222eaad\r\nba29b52e9c5aa88ea1caeeff29bfd491 --> cee6b5fdd0b6fbd0539cdcdc7f5a3324\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> 
330f463830aa97e88917d5a9d1c21500\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> dc7c5f0836f7d2564c402bf956722672\r\nba29b52e9c5aa88ea1caeeff29bfd491 --> 58d8518cb0d6ef6ad35dc242486f1beb\r\n79e1ea6822bff603a835fb8ee80c7ff3 --> e824ae072860bc545fc7d55aa0bca479\r\n135ee61e3402d6fcbd7a219b0b4ccd73 --> e824ae072860bc545fc7d55aa0bca479\r\n40109d487bb9f08608d8c5f6e747042f --> 33d806f9b732bfd6b96ae2e9e4243a68\r\n21ccfd2c550bd853d28581f0b0c9f9fe(seed<br>default.branch.name)\r\n21ccfd2c550bd853d28581f0b0c9f9fe --> fdcb9b6113856222e30e093f7c38065e\r\ndd5aab190ce844673819298c5b8fde76 --> bdcf4b078985f4a390e4ed4beacffa65\r\n9b92d5a346885079a2821c4d27cb5174 --> bdcf4b078985f4a390e4ed4beacffa65\r\n5a5493ab86ab4053f1d44302e7bdddd6 --> ff47cf65b58262acec28507f4427de45\r\n57651c1bcd24b794dfc8d1794ab556d5 --> ff47cf65b58262acec28507f4427de45\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> e58180baf478fe910359358a3fa02234\r\n40109d487bb9f08608d8c5f6e747042f --> c3bfe79b396a98ce2d9bfe772c9c20af\r\n2a1c620b0d510c3d8ed35deda41851c5 --> 4934c6211334318c63a5e91530171c9b\r\n2a1c620b0d510c3d8ed35deda41851c5 --> 5567dd8a6d7ae4fe86252db32e189a4d\r\n5ed1ab77e726d7efdcc41e9e2f8039c6 --> 6c2b36393ffff6be0b4ad333df2d9419\r\ndd5aab190ce844673819298c5b8fde76 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a\r\n9b92d5a346885079a2821c4d27cb5174 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a\r\n0cd9eb1ffb3c56d2b0a4359f800b1f20 --> 1d79010ee1550f057c531130814c40b9\r\ndd5aab190ce844673819298c5b8fde76 --> 712d4318e59bd2dc629f0ddebb257ca3\r\n9b92d5a346885079a2821c4d27cb5174 --> 712d4318e59bd2dc629f0ddebb257ca3\r\ne7ad3469d98c3bd160363dbc47e2d741(seed<br>MetaIssueTitle)\r\ne7ad3469d98c3bd160363dbc47e2d741 --> 38a94f1c2162803f571489d707d61021\r\n150204cd2d5a921deb53c312418379a1 --> 480d1cc478d23858e92d61225349b674\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 37035ea5a06a282bdc1e1de24090a36d\r\n5cc65e17d40e6a7223c1504f1c4b0d2a --> fdf0dbb8ca47ee9022b3daeb8c7df9c0\r\ndd5aab190ce844673819298c5b8fde76 --> 428ca84f627c695362652cc7531fc27b\r\n9b92d5a346885079a2821c4d27cb5174 --> 428ca84f627c695362652cc7531fc27b\r\ndd5aab190ce844673819298c5b8fde76 --> 68cf7d6869d027ca46a5fb4dbf7001d1\r\n9b92d5a346885079a2821c4d27cb5174 --> 68cf7d6869d027ca46a5fb4dbf7001d1\r\n150204cd2d5a921deb53c312418379a1 --> 37044e4d8610abe13849bc71a5cb7591\r\n2641f3b67327fb7518ee34a3a40b0755 --> 631c051fe6050ae8f8fc3321ed00802d\r\n2f9316539862f119f7c525bf9061e974 --> 182194bab776fc9bc406ed573d621b68\r\nd2708225c1f4c95d613a2645a17a5bc0(seed<br>repo.directory.readme.contents)\r\nd2708225c1f4c95d613a2645a17a5bc0 --> 54faf20bfdca0e63d07efb3e5a984cf1\r\n2f9316539862f119f7c525bf9061e974 --> 8c089c362960ccf181742334a3dccaea\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 0af5cbea9050874a0a3cba73bb61f892\r\n1daacccd02f8117e67ad3cb8686a732c(seed<br>ReadmeIssueBody)\r\n1daacccd02f8117e67ad3cb8686a732c --> d519830ab4e07ec391038e8581889ac3\r\n2f9316539862f119f7c525bf9061e974 --> 268852aa3fa8ab0864a32abae5a333f7\r\n0c1ab2d4bda10e1083557833ae5c5da4(seed<br>ReadmeIssueTitle)\r\n0c1ab2d4bda10e1083557833ae5c5da4 --> 77a11dd29af309cf43ed321446c4bf01\r\n150204cd2d5a921deb53c312418379a1 --> 127d77c3047facc1daa621148c5a0a1d\r\n40ddb5b508cb5643e7c91f7abdb72b84 --> cb421e4de153cbb912f7fbe57e4ad734\r\n0ee9f524d2db12be854fe611fa8126dd --> cbf7a0b88c0a41953b245303f3e9a0d3\r\nb4cff8d194413f436d94f9d84ece0262 --> e5f9ad44448abd2469b3fd9831f3d159\r\n2f9316539862f119f7c525bf9061e974 --> a35aee6711d240378eb57a3932537ca1\r\n956e024fde513b3a449eac9ee42d6ab3 --> 
dfcce88a7d605d46bf17de1159fbe5ad\r\n1d2360c9da18fac0b6ec142df8f3fbda --> c5dfd309617c909b852afe0b4ae4a178\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 3b2137dd1c61d0dac7d4e40fd6746cfb\r\n8d0adc31da1a0919724baf73d047743c --> 7440e73a8e8f864097f42162b74f2762\r\n8d0adc31da1a0919724baf73d047743c --> eed77b9eea541e0c378c67395351099c\r\na6ed501edbf561fda49a0a0a3ca310f0(seed<br>git_repo_ssh_key)\r\na6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05\r\n8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 181f1b33df4d795fbad2911ec7087e86\r\nend\r\n```\r\n\r\n- As of f8619a6362251d04929f4bfa395882b3257a3776 it works without meta issue\r\n creation: https://github.com/pdxjohnny/testaaaa/pull/193\r\n\r\n# 45\r\n\r\n```console\r\n$ gif-for-cli --rows $(tput lines) --cols $(tput cols) --export=/mnt/c/Users/Johnny/Downloads/alice-search-alices-adventures-in-wonderland-1.gif \"Alice's Adventures in Wonderland\"\r\n```\r\n\r\n```console\r\n$ watch -n 0.2 'grep FEEDFACE .output/$(ls .output/ | tail -n 1) | sed -e \"s/alice.please.contribute.recommended_community_standards.recommended_community_standards.//g\" | grep -i repo'\r\n```",
"createdAt": "2022-07-28T10:29:54Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABPO8w==",
"hasNextPage": false
},
"totalCount": 21,
"nodes": [
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n### 43\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition\r\n - Found out that it was seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayReadme to that it's definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - We found it created an infinite loop\r\n - Will try reusing redundancy checker, that seems to be doing well\r\n- https://github.com/intel/dffml/issues/1408\r\n- Now debugging why `readme_pr` not called, OverlayGit definitions were seen earlier\r\n on subflow start to be present, must be something else.\r\n - The logs tell us that alice_contribute_readme is returning `None`, which means\r\n that the downstream operation is not called, since None means no return value\r\n in this case.\r\n\r\n```\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n- Future\r\n - `run_custom` Optionally support forward subflow\r\n- TODO\r\n - [ ] Set definition proprety `AliceGitRepo.lock` to `True`\r\n\r\n\r\n### 44\r\n\r\n- Found out that util: subprocess: run command events: Do not return after yield of stdout/err\r\n - Fixed in b6eea6ed4549f9e7a89aab6306a51213b2bf36c9\r\n\r\n```console\r\n$ (for i in $(echo determin_base_branch readme_pr_body contribute_readme_md github_owns_remote alice_contribute_readme); do grep -rn \"${i} Outputs\" .output/2022-07-28-14-11.txt; done) | sort | uniq | sort\r\n354:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote Outputs: {'result': 
'origin'}\r\n361:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Outputs: {'result': 'master'}\r\n450:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body Outputs: {'result': 'Closes: https://github.com/pdxjohnny/testaaaa/issues/188'}\r\n472:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}\r\n479:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n```\r\n(Pdb)\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}\r\n(Pdb) pprint(readme_dataflow.flow['alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr'].inputs)\r\n{'base': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch': 'result'}],\r\n 'body': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body': 'result'}],\r\n 'head': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md': 'result'}],\r\n 'origin': ['seed'],\r\n 'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}],\r\n 'title': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_title': 'result'}]}\r\n```\r\n\r\n- origin is set to seed\r\n - `'origin': ['seed']` was there because `OverlayGitHub.github_owns_remote` is not in the flow\r\n - We forgot add it to `entry_points.txt`, added\r\n\r\n```console\r\n$ dffml service dev export alice.cli:AlicePleaseContributeCLIDataFlow | tee alice.please.contribute.recommended_community_standards.json\r\n$ (echo -e 'HTTP/1.0 200 OK\\n' && dffml dataflow diagram -shortname alice.please.contribute.recommended_community_standards.json) | nc -Nlp 9999;\r\n```\r\n\r\n- Opens\r\n - `guessed_repo_string_means_no_git_branch_given` is feeding `git_repo_default_branch` but `dffml dataflow diagram` just have a bug because it's not showing the connection.\r\n\r\n```mermaid\r\ngraph TD\r\nsubgraph a759a07029077edc5c37fea0326fa281[Processing Stage]\r\nstyle a759a07029077edc5c37fea0326fa281 fill:#afd388b5,stroke:#a4ca7a\r\nsubgraph 8cfb8cd5b8620de4a7ebe0dfec00771a[cli_has_repos]\r\nstyle 8cfb8cd5b8620de4a7ebe0dfec00771a fill:#fff4de,stroke:#cece71\r\nd493c90433d19f11f33c2d72cd144940[cli_has_repos]\r\ne07552ee3b6b7696cb3ddd786222eaad(cmd)\r\ne07552ee3b6b7696cb3ddd786222eaad --> d493c90433d19f11f33c2d72cd144940\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324(wanted)\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324 --> d493c90433d19f11f33c2d72cd144940\r\n79e1ea6822bff603a835fb8ee80c7ff3(result)\r\nd493c90433d19f11f33c2d72cd144940 --> 
79e1ea6822bff603a835fb8ee80c7ff3\r\nend\r\nsubgraph 0c2b64320fb5666a034794bb2195ecf0[cli_is_asking_for_recommended_community_standards]\r\nstyle 0c2b64320fb5666a034794bb2195ecf0 fill:#fff4de,stroke:#cece71\r\n222ee6c0209f1f1b7a782bc1276868c7[cli_is_asking_for_recommended_community_standards]\r\n330f463830aa97e88917d5a9d1c21500(cmd)\r\n330f463830aa97e88917d5a9d1c21500 --> 222ee6c0209f1f1b7a782bc1276868c7\r\nba29b52e9c5aa88ea1caeeff29bfd491(result)\r\n222ee6c0209f1f1b7a782bc1276868c7 --> ba29b52e9c5aa88ea1caeeff29bfd491\r\nend\r\nsubgraph eac58e8db2b55cb9cc5474aaa402c93e[cli_is_meant_on_this_repo]\r\nstyle eac58e8db2b55cb9cc5474aaa402c93e fill:#fff4de,stroke:#cece71\r\n6c819ad0228b0e7094b33e0634da9a38[cli_is_meant_on_this_repo]\r\ndc7c5f0836f7d2564c402bf956722672(cmd)\r\ndc7c5f0836f7d2564c402bf956722672 --> 6c819ad0228b0e7094b33e0634da9a38\r\n58d8518cb0d6ef6ad35dc242486f1beb(wanted)\r\n58d8518cb0d6ef6ad35dc242486f1beb --> 6c819ad0228b0e7094b33e0634da9a38\r\n135ee61e3402d6fcbd7a219b0b4ccd73(result)\r\n6c819ad0228b0e7094b33e0634da9a38 --> 135ee61e3402d6fcbd7a219b0b4ccd73\r\nend\r\nsubgraph 37887bf260c5c8e9bd18038401008bbc[cli_run_on_repo]\r\nstyle 37887bf260c5c8e9bd18038401008bbc fill:#fff4de,stroke:#cece71\r\n9d1042f33352800e54d98c9c5a4223df[cli_run_on_repo]\r\ne824ae072860bc545fc7d55aa0bca479(repo)\r\ne824ae072860bc545fc7d55aa0bca479 --> 9d1042f33352800e54d98c9c5a4223df\r\n40109d487bb9f08608d8c5f6e747042f(result)\r\n9d1042f33352800e54d98c9c5a4223df --> 40109d487bb9f08608d8c5f6e747042f\r\nend\r\nsubgraph 66ecd0c1f2e08941c443ec9cd89ec589[guess_repo_string_is_directory]\r\nstyle 66ecd0c1f2e08941c443ec9cd89ec589 fill:#fff4de,stroke:#cece71\r\n737d719a0c348ff65456024ddbc530fe[guess_repo_string_is_directory]\r\n33d806f9b732bfd6b96ae2e9e4243a68(repo_string)\r\n33d806f9b732bfd6b96ae2e9e4243a68 --> 737d719a0c348ff65456024ddbc530fe\r\ndd5aab190ce844673819298c5b8fde76(result)\r\n737d719a0c348ff65456024ddbc530fe --> dd5aab190ce844673819298c5b8fde76\r\nend\r\nsubgraph 4ea6696419c4a0862a4f63ea1f60c751[create_branch_if_none_exists]\r\nstyle 4ea6696419c4a0862a4f63ea1f60c751 fill:#fff4de,stroke:#cece71\r\n502369b37882b300d6620d5b4020f5b2[create_branch_if_none_exists]\r\nfdcb9b6113856222e30e093f7c38065e(name)\r\nfdcb9b6113856222e30e093f7c38065e --> 502369b37882b300d6620d5b4020f5b2\r\nbdcf4b078985f4a390e4ed4beacffa65(repo)\r\nbdcf4b078985f4a390e4ed4beacffa65 --> 502369b37882b300d6620d5b4020f5b2\r\n5a5493ab86ab4053f1d44302e7bdddd6(result)\r\n502369b37882b300d6620d5b4020f5b2 --> 5a5493ab86ab4053f1d44302e7bdddd6\r\nend\r\nsubgraph b1d510183f6a4c3fde207a4656c72cb4[determin_base_branch]\r\nstyle b1d510183f6a4c3fde207a4656c72cb4 fill:#fff4de,stroke:#cece71\r\n476aecd4d4d712cda1879feba46ea109[determin_base_branch]\r\nff47cf65b58262acec28507f4427de45(default_branch)\r\nff47cf65b58262acec28507f4427de45 --> 476aecd4d4d712cda1879feba46ea109\r\n150204cd2d5a921deb53c312418379a1(result)\r\n476aecd4d4d712cda1879feba46ea109 --> 150204cd2d5a921deb53c312418379a1\r\nend\r\nsubgraph 2a08ff341f159c170b7fe017eaad2f18[git_repo_to_alice_git_repo]\r\nstyle 2a08ff341f159c170b7fe017eaad2f18 fill:#fff4de,stroke:#cece71\r\n7f74112f6d30c6289caa0a000e87edab[git_repo_to_alice_git_repo]\r\ne58180baf478fe910359358a3fa02234(repo)\r\ne58180baf478fe910359358a3fa02234 --> 7f74112f6d30c6289caa0a000e87edab\r\n9b92d5a346885079a2821c4d27cb5174(result)\r\n7f74112f6d30c6289caa0a000e87edab --> 9b92d5a346885079a2821c4d27cb5174\r\nend\r\nsubgraph b5d35aa8a8dcd28d22d47caad02676b0[guess_repo_string_is_url]\r\nstyle b5d35aa8a8dcd28d22d47caad02676b0 
fill:#fff4de,stroke:#cece71\r\n0de074e71a32e30889b8bb400cf8db9f[guess_repo_string_is_url]\r\nc3bfe79b396a98ce2d9bfe772c9c20af(repo_string)\r\nc3bfe79b396a98ce2d9bfe772c9c20af --> 0de074e71a32e30889b8bb400cf8db9f\r\n2a1c620b0d510c3d8ed35deda41851c5(result)\r\n0de074e71a32e30889b8bb400cf8db9f --> 2a1c620b0d510c3d8ed35deda41851c5\r\nend\r\nsubgraph 60791520c6d124c0bf15e599132b0caf[guessed_repo_string_is_operations_git_url]\r\nstyle 60791520c6d124c0bf15e599132b0caf fill:#fff4de,stroke:#cece71\r\n102f173505d7b546236cdeff191369d4[guessed_repo_string_is_operations_git_url]\r\n4934c6211334318c63a5e91530171c9b(repo_url)\r\n4934c6211334318c63a5e91530171c9b --> 102f173505d7b546236cdeff191369d4\r\n8d0adc31da1a0919724baf73d047743c(result)\r\n102f173505d7b546236cdeff191369d4 --> 8d0adc31da1a0919724baf73d047743c\r\nend\r\nsubgraph f2c7b93622447999daab403713239ada[guessed_repo_string_means_no_git_branch_given]\r\nstyle f2c7b93622447999daab403713239ada fill:#fff4de,stroke:#cece71\r\nc8294a87e7aae8f7f9cb7f53e054fed5[guessed_repo_string_means_no_git_branch_given]\r\n5567dd8a6d7ae4fe86252db32e189a4d(repo_url)\r\n5567dd8a6d7ae4fe86252db32e189a4d --> c8294a87e7aae8f7f9cb7f53e054fed5\r\nd888e6b64b5e3496056088f14dab9894(result)\r\nc8294a87e7aae8f7f9cb7f53e054fed5 --> d888e6b64b5e3496056088f14dab9894\r\nend\r\nsubgraph 113addf4beee5305fdc79d2363608f9d[github_owns_remote]\r\nstyle 113addf4beee5305fdc79d2363608f9d fill:#fff4de,stroke:#cece71\r\n049b72b81b976fbb43607bfeeb0464c5[github_owns_remote]\r\n6c2b36393ffff6be0b4ad333df2d9419(remote)\r\n6c2b36393ffff6be0b4ad333df2d9419 --> 049b72b81b976fbb43607bfeeb0464c5\r\n19a9ee483c1743e6ecf0a2dc3b6f8c7a(repo)\r\n19a9ee483c1743e6ecf0a2dc3b6f8c7a --> 049b72b81b976fbb43607bfeeb0464c5\r\nb4cff8d194413f436d94f9d84ece0262(result)\r\n049b72b81b976fbb43607bfeeb0464c5 --> b4cff8d194413f436d94f9d84ece0262\r\nend\r\nsubgraph 43a22312a3d4f5c995c54c5196acc50a[create_meta_issue]\r\nstyle 43a22312a3d4f5c995c54c5196acc50a fill:#fff4de,stroke:#cece71\r\nd2345f23e5ef9f54c591c4a687c24575[create_meta_issue]\r\n1d79010ee1550f057c531130814c40b9(body)\r\n1d79010ee1550f057c531130814c40b9 --> d2345f23e5ef9f54c591c4a687c24575\r\n712d4318e59bd2dc629f0ddebb257ca3(repo)\r\n712d4318e59bd2dc629f0ddebb257ca3 --> d2345f23e5ef9f54c591c4a687c24575\r\n38a94f1c2162803f571489d707d61021(title)\r\n38a94f1c2162803f571489d707d61021 --> d2345f23e5ef9f54c591c4a687c24575\r\n2b22b4998ac3e6a64d82e0147e71ee1b(result)\r\nd2345f23e5ef9f54c591c4a687c24575 --> 2b22b4998ac3e6a64d82e0147e71ee1b\r\nend\r\nsubgraph f77af509c413b86b6cd7e107cc623c73[meta_issue_body]\r\nstyle f77af509c413b86b6cd7e107cc623c73 fill:#fff4de,stroke:#cece71\r\n69a9852570720a3d35cb9dd52a281f71[meta_issue_body]\r\n480d1cc478d23858e92d61225349b674(base)\r\n480d1cc478d23858e92d61225349b674 --> 69a9852570720a3d35cb9dd52a281f71\r\n37035ea5a06a282bdc1e1de24090a36d(readme_issue)\r\n37035ea5a06a282bdc1e1de24090a36d --> 69a9852570720a3d35cb9dd52a281f71\r\nfdf0dbb8ca47ee9022b3daeb8c7df9c0(readme_path)\r\nfdf0dbb8ca47ee9022b3daeb8c7df9c0 --> 69a9852570720a3d35cb9dd52a281f71\r\n428ca84f627c695362652cc7531fc27b(repo)\r\n428ca84f627c695362652cc7531fc27b --> 69a9852570720a3d35cb9dd52a281f71\r\n0cd9eb1ffb3c56d2b0a4359f800b1f20(result)\r\n69a9852570720a3d35cb9dd52a281f71 --> 0cd9eb1ffb3c56d2b0a4359f800b1f20\r\nend\r\nsubgraph 8506cba6514466fb6d65f33ace4b0eac[alice_contribute_readme]\r\nstyle 8506cba6514466fb6d65f33ace4b0eac 
fill:#fff4de,stroke:#cece71\r\nd4507d3d1c3fbf3e7e373eae24797667[alice_contribute_readme]\r\n68cf7d6869d027ca46a5fb4dbf7001d1(repo)\r\n68cf7d6869d027ca46a5fb4dbf7001d1 --> d4507d3d1c3fbf3e7e373eae24797667\r\n2f9316539862f119f7c525bf9061e974(result)\r\nd4507d3d1c3fbf3e7e373eae24797667 --> 2f9316539862f119f7c525bf9061e974\r\nend\r\nsubgraph 4233e6dc67cba131d4ef005af9c02959[contribute_readme_md]\r\nstyle 4233e6dc67cba131d4ef005af9c02959 fill:#fff4de,stroke:#cece71\r\n3db0ee5d6ab83886bded5afd86f3f88f[contribute_readme_md]\r\n37044e4d8610abe13849bc71a5cb7591(base)\r\n37044e4d8610abe13849bc71a5cb7591 --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n631c051fe6050ae8f8fc3321ed00802d(commit_message)\r\n631c051fe6050ae8f8fc3321ed00802d --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n182194bab776fc9bc406ed573d621b68(repo)\r\n182194bab776fc9bc406ed573d621b68 --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n0ee9f524d2db12be854fe611fa8126dd(result)\r\n3db0ee5d6ab83886bded5afd86f3f88f --> 0ee9f524d2db12be854fe611fa8126dd\r\nend\r\nsubgraph a6080d9c45eb5f806a47152a18bf7830[create_readme_file_if_not_exists]\r\nstyle a6080d9c45eb5f806a47152a18bf7830 fill:#fff4de,stroke:#cece71\r\n67e388f508dd96084c37d236a2c67e67[create_readme_file_if_not_exists]\r\n54faf20bfdca0e63d07efb3e5a984cf1(readme_contents)\r\n54faf20bfdca0e63d07efb3e5a984cf1 --> 67e388f508dd96084c37d236a2c67e67\r\n8c089c362960ccf181742334a3dccaea(repo)\r\n8c089c362960ccf181742334a3dccaea --> 67e388f508dd96084c37d236a2c67e67\r\n5cc65e17d40e6a7223c1504f1c4b0d2a(result)\r\n67e388f508dd96084c37d236a2c67e67 --> 5cc65e17d40e6a7223c1504f1c4b0d2a\r\nend\r\nsubgraph e7757158127e9845b2915c16a7fa80c5[readme_commit_message]\r\nstyle e7757158127e9845b2915c16a7fa80c5 fill:#fff4de,stroke:#cece71\r\n562bdc535c7cebfc66dba920b1a17540[readme_commit_message]\r\n0af5cbea9050874a0a3cba73bb61f892(issue_url)\r\n0af5cbea9050874a0a3cba73bb61f892 --> 562bdc535c7cebfc66dba920b1a17540\r\n2641f3b67327fb7518ee34a3a40b0755(result)\r\n562bdc535c7cebfc66dba920b1a17540 --> 2641f3b67327fb7518ee34a3a40b0755\r\nend\r\nsubgraph cf99ff6fad80e9c21266b43fd67b2f7b[readme_issue]\r\nstyle cf99ff6fad80e9c21266b43fd67b2f7b fill:#fff4de,stroke:#cece71\r\nda44417f891a945085590baafffc2bdb[readme_issue]\r\nd519830ab4e07ec391038e8581889ac3(body)\r\nd519830ab4e07ec391038e8581889ac3 --> da44417f891a945085590baafffc2bdb\r\n268852aa3fa8ab0864a32abae5a333f7(repo)\r\n268852aa3fa8ab0864a32abae5a333f7 --> da44417f891a945085590baafffc2bdb\r\n77a11dd29af309cf43ed321446c4bf01(title)\r\n77a11dd29af309cf43ed321446c4bf01 --> da44417f891a945085590baafffc2bdb\r\n1d2360c9da18fac0b6ec142df8f3fbda(result)\r\nda44417f891a945085590baafffc2bdb --> 1d2360c9da18fac0b6ec142df8f3fbda\r\nend\r\nsubgraph 7ec0442cf2d95c367912e8abee09b217[readme_pr]\r\nstyle 7ec0442cf2d95c367912e8abee09b217 fill:#fff4de,stroke:#cece71\r\nbb314dc452cde5b6af5ea94dd277ba40[readme_pr]\r\n127d77c3047facc1daa621148c5a0a1d(base)\r\n127d77c3047facc1daa621148c5a0a1d --> bb314dc452cde5b6af5ea94dd277ba40\r\ncb421e4de153cbb912f7fbe57e4ad734(body)\r\ncb421e4de153cbb912f7fbe57e4ad734 --> bb314dc452cde5b6af5ea94dd277ba40\r\ncbf7a0b88c0a41953b245303f3e9a0d3(head)\r\ncbf7a0b88c0a41953b245303f3e9a0d3 --> bb314dc452cde5b6af5ea94dd277ba40\r\ne5f9ad44448abd2469b3fd9831f3d159(origin)\r\ne5f9ad44448abd2469b3fd9831f3d159 --> bb314dc452cde5b6af5ea94dd277ba40\r\na35aee6711d240378eb57a3932537ca1(repo)\r\na35aee6711d240378eb57a3932537ca1 --> bb314dc452cde5b6af5ea94dd277ba40\r\ndfcce88a7d605d46bf17de1159fbe5ad(title)\r\ndfcce88a7d605d46bf17de1159fbe5ad --> 
bb314dc452cde5b6af5ea94dd277ba40\r\na210a7890a7bea8d629368e02da3d806(result)\r\nbb314dc452cde5b6af5ea94dd277ba40 --> a210a7890a7bea8d629368e02da3d806\r\nend\r\nsubgraph 227eabb1f1c5cc0bc931714a03049e27[readme_pr_body]\r\nstyle 227eabb1f1c5cc0bc931714a03049e27 fill:#fff4de,stroke:#cece71\r\n2aea976396cfe68dacd9bc7d4a3f0cba[readme_pr_body]\r\nc5dfd309617c909b852afe0b4ae4a178(readme_issue)\r\nc5dfd309617c909b852afe0b4ae4a178 --> 2aea976396cfe68dacd9bc7d4a3f0cba\r\n40ddb5b508cb5643e7c91f7abdb72b84(result)\r\n2aea976396cfe68dacd9bc7d4a3f0cba --> 40ddb5b508cb5643e7c91f7abdb72b84\r\nend\r\nsubgraph 48687c84e69b3db0acca625cbe2e6b49[readme_pr_title]\r\nstyle 48687c84e69b3db0acca625cbe2e6b49 fill:#fff4de,stroke:#cece71\r\nd8668ff93f41bc241c8c540199cd7453[readme_pr_title]\r\n3b2137dd1c61d0dac7d4e40fd6746cfb(readme_issue)\r\n3b2137dd1c61d0dac7d4e40fd6746cfb --> d8668ff93f41bc241c8c540199cd7453\r\n956e024fde513b3a449eac9ee42d6ab3(result)\r\nd8668ff93f41bc241c8c540199cd7453 --> 956e024fde513b3a449eac9ee42d6ab3\r\nend\r\nsubgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL]\r\nstyle d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71\r\nf577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL]\r\n7440e73a8e8f864097f42162b74f2762(URL)\r\n7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40\r\n8e39b501b41c5d0e4596318f80a03210(valid)\r\nf577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210\r\nend\r\nsubgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo]\r\nstyle af8da22d1318d911f29b95e687f87c5d fill:#fff4de,stroke:#cece71\r\n155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo]\r\need77b9eea541e0c378c67395351099c(URL)\r\need77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n8b5928cd265dd2c44d67d076f60c8b05(ssh_key)\r\n8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n4e1d5ea96e050e46ebf95ebc0713d54c(repo)\r\n155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c\r\n6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL}\r\n6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\nend\r\nsubgraph d3d91578caf34c0ae944b17853783406[git_repo_default_branch]\r\nstyle d3d91578caf34c0ae944b17853783406 fill:#fff4de,stroke:#cece71\r\n546062a96122df465d2631f31df4e9e3[git_repo_default_branch]\r\n181f1b33df4d795fbad2911ec7087e86(repo)\r\n181f1b33df4d795fbad2911ec7087e86 --> 546062a96122df465d2631f31df4e9e3\r\n57651c1bcd24b794dfc8d1794ab556d5(branch)\r\n546062a96122df465d2631f31df4e9e3 --> 57651c1bcd24b794dfc8d1794ab556d5\r\n5ed1ab77e726d7efdcc41e9e2f8039c6(remote)\r\n546062a96122df465d2631f31df4e9e3 --> 5ed1ab77e726d7efdcc41e9e2f8039c6\r\n4c3cdd5f15b7a846d291aac089e8a622{no_git_branch_given}\r\n4c3cdd5f15b7a846d291aac089e8a622 --> 546062a96122df465d2631f31df4e9e3\r\nend\r\nend\r\nsubgraph a4827add25f5c7d5895c5728b74e2beb[Cleanup Stage]\r\nstyle a4827add25f5c7d5895c5728b74e2beb fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph 58ca4d24d2767176f196436c2890b926[Output Stage]\r\nstyle 58ca4d24d2767176f196436c2890b926 fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph inputs[Inputs]\r\nstyle inputs fill:#f6dbf9,stroke:#a178ca\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> e07552ee3b6b7696cb3ddd786222eaad\r\nba29b52e9c5aa88ea1caeeff29bfd491 --> cee6b5fdd0b6fbd0539cdcdc7f5a3324\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> 
330f463830aa97e88917d5a9d1c21500\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> dc7c5f0836f7d2564c402bf956722672\r\nba29b52e9c5aa88ea1caeeff29bfd491 --> 58d8518cb0d6ef6ad35dc242486f1beb\r\n79e1ea6822bff603a835fb8ee80c7ff3 --> e824ae072860bc545fc7d55aa0bca479\r\n135ee61e3402d6fcbd7a219b0b4ccd73 --> e824ae072860bc545fc7d55aa0bca479\r\n40109d487bb9f08608d8c5f6e747042f --> 33d806f9b732bfd6b96ae2e9e4243a68\r\n21ccfd2c550bd853d28581f0b0c9f9fe(seed<br>default.branch.name)\r\n21ccfd2c550bd853d28581f0b0c9f9fe --> fdcb9b6113856222e30e093f7c38065e\r\ndd5aab190ce844673819298c5b8fde76 --> bdcf4b078985f4a390e4ed4beacffa65\r\n9b92d5a346885079a2821c4d27cb5174 --> bdcf4b078985f4a390e4ed4beacffa65\r\n5a5493ab86ab4053f1d44302e7bdddd6 --> ff47cf65b58262acec28507f4427de45\r\n57651c1bcd24b794dfc8d1794ab556d5 --> ff47cf65b58262acec28507f4427de45\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> e58180baf478fe910359358a3fa02234\r\n40109d487bb9f08608d8c5f6e747042f --> c3bfe79b396a98ce2d9bfe772c9c20af\r\n2a1c620b0d510c3d8ed35deda41851c5 --> 4934c6211334318c63a5e91530171c9b\r\n2a1c620b0d510c3d8ed35deda41851c5 --> 5567dd8a6d7ae4fe86252db32e189a4d\r\n5ed1ab77e726d7efdcc41e9e2f8039c6 --> 6c2b36393ffff6be0b4ad333df2d9419\r\ndd5aab190ce844673819298c5b8fde76 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a\r\n9b92d5a346885079a2821c4d27cb5174 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a\r\n0cd9eb1ffb3c56d2b0a4359f800b1f20 --> 1d79010ee1550f057c531130814c40b9\r\ndd5aab190ce844673819298c5b8fde76 --> 712d4318e59bd2dc629f0ddebb257ca3\r\n9b92d5a346885079a2821c4d27cb5174 --> 712d4318e59bd2dc629f0ddebb257ca3\r\ne7ad3469d98c3bd160363dbc47e2d741(seed<br>MetaIssueTitle)\r\ne7ad3469d98c3bd160363dbc47e2d741 --> 38a94f1c2162803f571489d707d61021\r\n150204cd2d5a921deb53c312418379a1 --> 480d1cc478d23858e92d61225349b674\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 37035ea5a06a282bdc1e1de24090a36d\r\n5cc65e17d40e6a7223c1504f1c4b0d2a --> fdf0dbb8ca47ee9022b3daeb8c7df9c0\r\ndd5aab190ce844673819298c5b8fde76 --> 428ca84f627c695362652cc7531fc27b\r\n9b92d5a346885079a2821c4d27cb5174 --> 428ca84f627c695362652cc7531fc27b\r\ndd5aab190ce844673819298c5b8fde76 --> 68cf7d6869d027ca46a5fb4dbf7001d1\r\n9b92d5a346885079a2821c4d27cb5174 --> 68cf7d6869d027ca46a5fb4dbf7001d1\r\n150204cd2d5a921deb53c312418379a1 --> 37044e4d8610abe13849bc71a5cb7591\r\n2641f3b67327fb7518ee34a3a40b0755 --> 631c051fe6050ae8f8fc3321ed00802d\r\n2f9316539862f119f7c525bf9061e974 --> 182194bab776fc9bc406ed573d621b68\r\nd2708225c1f4c95d613a2645a17a5bc0(seed<br>repo.directory.readme.contents)\r\nd2708225c1f4c95d613a2645a17a5bc0 --> 54faf20bfdca0e63d07efb3e5a984cf1\r\n2f9316539862f119f7c525bf9061e974 --> 8c089c362960ccf181742334a3dccaea\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 0af5cbea9050874a0a3cba73bb61f892\r\n1daacccd02f8117e67ad3cb8686a732c(seed<br>ReadmeIssueBody)\r\n1daacccd02f8117e67ad3cb8686a732c --> d519830ab4e07ec391038e8581889ac3\r\n2f9316539862f119f7c525bf9061e974 --> 268852aa3fa8ab0864a32abae5a333f7\r\n0c1ab2d4bda10e1083557833ae5c5da4(seed<br>ReadmeIssueTitle)\r\n0c1ab2d4bda10e1083557833ae5c5da4 --> 77a11dd29af309cf43ed321446c4bf01\r\n150204cd2d5a921deb53c312418379a1 --> 127d77c3047facc1daa621148c5a0a1d\r\n40ddb5b508cb5643e7c91f7abdb72b84 --> cb421e4de153cbb912f7fbe57e4ad734\r\n0ee9f524d2db12be854fe611fa8126dd --> cbf7a0b88c0a41953b245303f3e9a0d3\r\nb4cff8d194413f436d94f9d84ece0262 --> e5f9ad44448abd2469b3fd9831f3d159\r\n2f9316539862f119f7c525bf9061e974 --> a35aee6711d240378eb57a3932537ca1\r\n956e024fde513b3a449eac9ee42d6ab3 --> 
dfcce88a7d605d46bf17de1159fbe5ad\r\n1d2360c9da18fac0b6ec142df8f3fbda --> c5dfd309617c909b852afe0b4ae4a178\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 3b2137dd1c61d0dac7d4e40fd6746cfb\r\n8d0adc31da1a0919724baf73d047743c --> 7440e73a8e8f864097f42162b74f2762\r\n8d0adc31da1a0919724baf73d047743c --> eed77b9eea541e0c378c67395351099c\r\na6ed501edbf561fda49a0a0a3ca310f0(seed<br>git_repo_ssh_key)\r\na6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05\r\n8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 181f1b33df4d795fbad2911ec7087e86\r\nend\r\n```\r\n\r\n- As of f8619a6362251d04929f4bfa395882b3257a3776 it works without meta issue\r\n creation: https://github.com/pdxjohnny/testaaaa/pull/193\r\n\r\n# 45\r\n\r\n```console\r\n$ gif-for-cli --rows $(tput lines) --cols $(tput cols) --export=/mnt/c/Users/Johnny/Downloads/alice-search-alices-adventures-in-wonderland-1.gif \"Alice's Adventures in Wonderland\"\r\n```\r\n\r\n```console\r\n$ watch -n 0.2 'grep FEEDFACE .output/$(ls .output/ | tail -n 1) | sed -e \"s/alice.please.contribute.recommended_community_standards.recommended_community_standards.//g\" | grep -i repo'\r\n```",
"editedAt": "2022-07-29T13:19:47Z"
},
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n### 43\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition\r\n - Found out that it was seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayReadme to that it's definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - We found it created an infinite loop\r\n - Will try reusing redundancy checker, that seems to be doing well\r\n- https://github.com/intel/dffml/issues/1408\r\n- Now debugging why `readme_pr` not called, OverlayGit definitions were seen earlier\r\n on subflow start to be present, must be something else.\r\n - The logs tell us that alice_contribute_readme is returning `None`, which means\r\n that the downstream operation is not called, since None means no return value\r\n in this case.\r\n\r\n```\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n- Future\r\n - `run_custom` Optionally support forward subflow\r\n- TODO\r\n - [ ] Set definition proprety `AliceGitRepo.lock` to `True`\r\n\r\n\r\n### 44\r\n\r\n- Found out that util: subprocess: run command events: Do not return after yield of stdout/err\r\n - Fixed in b6eea6ed4549f9e7a89aab6306a51213b2bf36c9\r\n\r\n```console\r\n$ (for i in $(echo determin_base_branch readme_pr_body contribute_readme_md github_owns_remote alice_contribute_readme); do grep -rn \"${i} Outputs\" .output/2022-07-28-14-11.txt; done) | sort | uniq | sort\r\n354:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote Outputs: {'result': 
'origin'}\r\n361:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Outputs: {'result': 'master'}\r\n450:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body Outputs: {'result': 'Closes: https://github.com/pdxjohnny/testaaaa/issues/188'}\r\n472:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}\r\n479:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n```\r\n(Pdb)\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}\r\n(Pdb) pprint(readme_dataflow.flow['alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr'].inputs)\r\n{'base': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch': 'result'}],\r\n 'body': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body': 'result'}],\r\n 'head': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md': 'result'}],\r\n 'origin': ['seed'],\r\n 'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}],\r\n 'title': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_title': 'result'}]}\r\n```\r\n\r\n- origin is set to seed\r\n - `'origin': ['seed']` was there because `OverlayGitHub.github_owns_remote` is not in the flow\r\n - We forgot add it to `entry_points.txt`, added\r\n\r\n```console\r\n$ dffml service dev export alice.cli:AlicePleaseContributeCLIDataFlow | tee alice.please.contribute.recommended_community_standards.json\r\n$ (echo -e 'HTTP/1.0 200 OK\\n' && dffml dataflow diagram -shortname alice.please.contribute.recommended_community_standards.json) | nc -Nlp 9999;\r\n```\r\n\r\n- Opens\r\n - `guessed_repo_string_means_no_git_branch_given` is feeding `git_repo_default_branch` but `dffml dataflow diagram` just have a bug because it's not showing the connection.\r\n\r\n```mermaid\r\ngraph TD\r\nsubgraph a759a07029077edc5c37fea0326fa281[Processing Stage]\r\nstyle a759a07029077edc5c37fea0326fa281 fill:#afd388b5,stroke:#a4ca7a\r\nsubgraph 8cfb8cd5b8620de4a7ebe0dfec00771a[cli_has_repos]\r\nstyle 8cfb8cd5b8620de4a7ebe0dfec00771a fill:#fff4de,stroke:#cece71\r\nd493c90433d19f11f33c2d72cd144940[cli_has_repos]\r\ne07552ee3b6b7696cb3ddd786222eaad(cmd)\r\ne07552ee3b6b7696cb3ddd786222eaad --> d493c90433d19f11f33c2d72cd144940\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324(wanted)\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324 --> d493c90433d19f11f33c2d72cd144940\r\n79e1ea6822bff603a835fb8ee80c7ff3(result)\r\nd493c90433d19f11f33c2d72cd144940 --> 
79e1ea6822bff603a835fb8ee80c7ff3\r\nend\r\nsubgraph 0c2b64320fb5666a034794bb2195ecf0[cli_is_asking_for_recommended_community_standards]\r\nstyle 0c2b64320fb5666a034794bb2195ecf0 fill:#fff4de,stroke:#cece71\r\n222ee6c0209f1f1b7a782bc1276868c7[cli_is_asking_for_recommended_community_standards]\r\n330f463830aa97e88917d5a9d1c21500(cmd)\r\n330f463830aa97e88917d5a9d1c21500 --> 222ee6c0209f1f1b7a782bc1276868c7\r\nba29b52e9c5aa88ea1caeeff29bfd491(result)\r\n222ee6c0209f1f1b7a782bc1276868c7 --> ba29b52e9c5aa88ea1caeeff29bfd491\r\nend\r\nsubgraph eac58e8db2b55cb9cc5474aaa402c93e[cli_is_meant_on_this_repo]\r\nstyle eac58e8db2b55cb9cc5474aaa402c93e fill:#fff4de,stroke:#cece71\r\n6c819ad0228b0e7094b33e0634da9a38[cli_is_meant_on_this_repo]\r\ndc7c5f0836f7d2564c402bf956722672(cmd)\r\ndc7c5f0836f7d2564c402bf956722672 --> 6c819ad0228b0e7094b33e0634da9a38\r\n58d8518cb0d6ef6ad35dc242486f1beb(wanted)\r\n58d8518cb0d6ef6ad35dc242486f1beb --> 6c819ad0228b0e7094b33e0634da9a38\r\n135ee61e3402d6fcbd7a219b0b4ccd73(result)\r\n6c819ad0228b0e7094b33e0634da9a38 --> 135ee61e3402d6fcbd7a219b0b4ccd73\r\nend\r\nsubgraph 37887bf260c5c8e9bd18038401008bbc[cli_run_on_repo]\r\nstyle 37887bf260c5c8e9bd18038401008bbc fill:#fff4de,stroke:#cece71\r\n9d1042f33352800e54d98c9c5a4223df[cli_run_on_repo]\r\ne824ae072860bc545fc7d55aa0bca479(repo)\r\ne824ae072860bc545fc7d55aa0bca479 --> 9d1042f33352800e54d98c9c5a4223df\r\n40109d487bb9f08608d8c5f6e747042f(result)\r\n9d1042f33352800e54d98c9c5a4223df --> 40109d487bb9f08608d8c5f6e747042f\r\nend\r\nsubgraph 66ecd0c1f2e08941c443ec9cd89ec589[guess_repo_string_is_directory]\r\nstyle 66ecd0c1f2e08941c443ec9cd89ec589 fill:#fff4de,stroke:#cece71\r\n737d719a0c348ff65456024ddbc530fe[guess_repo_string_is_directory]\r\n33d806f9b732bfd6b96ae2e9e4243a68(repo_string)\r\n33d806f9b732bfd6b96ae2e9e4243a68 --> 737d719a0c348ff65456024ddbc530fe\r\ndd5aab190ce844673819298c5b8fde76(result)\r\n737d719a0c348ff65456024ddbc530fe --> dd5aab190ce844673819298c5b8fde76\r\nend\r\nsubgraph 4ea6696419c4a0862a4f63ea1f60c751[create_branch_if_none_exists]\r\nstyle 4ea6696419c4a0862a4f63ea1f60c751 fill:#fff4de,stroke:#cece71\r\n502369b37882b300d6620d5b4020f5b2[create_branch_if_none_exists]\r\nfdcb9b6113856222e30e093f7c38065e(name)\r\nfdcb9b6113856222e30e093f7c38065e --> 502369b37882b300d6620d5b4020f5b2\r\nbdcf4b078985f4a390e4ed4beacffa65(repo)\r\nbdcf4b078985f4a390e4ed4beacffa65 --> 502369b37882b300d6620d5b4020f5b2\r\n5a5493ab86ab4053f1d44302e7bdddd6(result)\r\n502369b37882b300d6620d5b4020f5b2 --> 5a5493ab86ab4053f1d44302e7bdddd6\r\nend\r\nsubgraph b1d510183f6a4c3fde207a4656c72cb4[determin_base_branch]\r\nstyle b1d510183f6a4c3fde207a4656c72cb4 fill:#fff4de,stroke:#cece71\r\n476aecd4d4d712cda1879feba46ea109[determin_base_branch]\r\nff47cf65b58262acec28507f4427de45(default_branch)\r\nff47cf65b58262acec28507f4427de45 --> 476aecd4d4d712cda1879feba46ea109\r\n150204cd2d5a921deb53c312418379a1(result)\r\n476aecd4d4d712cda1879feba46ea109 --> 150204cd2d5a921deb53c312418379a1\r\nend\r\nsubgraph 2a08ff341f159c170b7fe017eaad2f18[git_repo_to_alice_git_repo]\r\nstyle 2a08ff341f159c170b7fe017eaad2f18 fill:#fff4de,stroke:#cece71\r\n7f74112f6d30c6289caa0a000e87edab[git_repo_to_alice_git_repo]\r\ne58180baf478fe910359358a3fa02234(repo)\r\ne58180baf478fe910359358a3fa02234 --> 7f74112f6d30c6289caa0a000e87edab\r\n9b92d5a346885079a2821c4d27cb5174(result)\r\n7f74112f6d30c6289caa0a000e87edab --> 9b92d5a346885079a2821c4d27cb5174\r\nend\r\nsubgraph b5d35aa8a8dcd28d22d47caad02676b0[guess_repo_string_is_url]\r\nstyle b5d35aa8a8dcd28d22d47caad02676b0 
fill:#fff4de,stroke:#cece71\r\n0de074e71a32e30889b8bb400cf8db9f[guess_repo_string_is_url]\r\nc3bfe79b396a98ce2d9bfe772c9c20af(repo_string)\r\nc3bfe79b396a98ce2d9bfe772c9c20af --> 0de074e71a32e30889b8bb400cf8db9f\r\n2a1c620b0d510c3d8ed35deda41851c5(result)\r\n0de074e71a32e30889b8bb400cf8db9f --> 2a1c620b0d510c3d8ed35deda41851c5\r\nend\r\nsubgraph 60791520c6d124c0bf15e599132b0caf[guessed_repo_string_is_operations_git_url]\r\nstyle 60791520c6d124c0bf15e599132b0caf fill:#fff4de,stroke:#cece71\r\n102f173505d7b546236cdeff191369d4[guessed_repo_string_is_operations_git_url]\r\n4934c6211334318c63a5e91530171c9b(repo_url)\r\n4934c6211334318c63a5e91530171c9b --> 102f173505d7b546236cdeff191369d4\r\n8d0adc31da1a0919724baf73d047743c(result)\r\n102f173505d7b546236cdeff191369d4 --> 8d0adc31da1a0919724baf73d047743c\r\nend\r\nsubgraph f2c7b93622447999daab403713239ada[guessed_repo_string_means_no_git_branch_given]\r\nstyle f2c7b93622447999daab403713239ada fill:#fff4de,stroke:#cece71\r\nc8294a87e7aae8f7f9cb7f53e054fed5[guessed_repo_string_means_no_git_branch_given]\r\n5567dd8a6d7ae4fe86252db32e189a4d(repo_url)\r\n5567dd8a6d7ae4fe86252db32e189a4d --> c8294a87e7aae8f7f9cb7f53e054fed5\r\nd888e6b64b5e3496056088f14dab9894(result)\r\nc8294a87e7aae8f7f9cb7f53e054fed5 --> d888e6b64b5e3496056088f14dab9894\r\nend\r\nsubgraph 113addf4beee5305fdc79d2363608f9d[github_owns_remote]\r\nstyle 113addf4beee5305fdc79d2363608f9d fill:#fff4de,stroke:#cece71\r\n049b72b81b976fbb43607bfeeb0464c5[github_owns_remote]\r\n6c2b36393ffff6be0b4ad333df2d9419(remote)\r\n6c2b36393ffff6be0b4ad333df2d9419 --> 049b72b81b976fbb43607bfeeb0464c5\r\n19a9ee483c1743e6ecf0a2dc3b6f8c7a(repo)\r\n19a9ee483c1743e6ecf0a2dc3b6f8c7a --> 049b72b81b976fbb43607bfeeb0464c5\r\nb4cff8d194413f436d94f9d84ece0262(result)\r\n049b72b81b976fbb43607bfeeb0464c5 --> b4cff8d194413f436d94f9d84ece0262\r\nend\r\nsubgraph 43a22312a3d4f5c995c54c5196acc50a[create_meta_issue]\r\nstyle 43a22312a3d4f5c995c54c5196acc50a fill:#fff4de,stroke:#cece71\r\nd2345f23e5ef9f54c591c4a687c24575[create_meta_issue]\r\n1d79010ee1550f057c531130814c40b9(body)\r\n1d79010ee1550f057c531130814c40b9 --> d2345f23e5ef9f54c591c4a687c24575\r\n712d4318e59bd2dc629f0ddebb257ca3(repo)\r\n712d4318e59bd2dc629f0ddebb257ca3 --> d2345f23e5ef9f54c591c4a687c24575\r\n38a94f1c2162803f571489d707d61021(title)\r\n38a94f1c2162803f571489d707d61021 --> d2345f23e5ef9f54c591c4a687c24575\r\n2b22b4998ac3e6a64d82e0147e71ee1b(result)\r\nd2345f23e5ef9f54c591c4a687c24575 --> 2b22b4998ac3e6a64d82e0147e71ee1b\r\nend\r\nsubgraph f77af509c413b86b6cd7e107cc623c73[meta_issue_body]\r\nstyle f77af509c413b86b6cd7e107cc623c73 fill:#fff4de,stroke:#cece71\r\n69a9852570720a3d35cb9dd52a281f71[meta_issue_body]\r\n480d1cc478d23858e92d61225349b674(base)\r\n480d1cc478d23858e92d61225349b674 --> 69a9852570720a3d35cb9dd52a281f71\r\n37035ea5a06a282bdc1e1de24090a36d(readme_issue)\r\n37035ea5a06a282bdc1e1de24090a36d --> 69a9852570720a3d35cb9dd52a281f71\r\nfdf0dbb8ca47ee9022b3daeb8c7df9c0(readme_path)\r\nfdf0dbb8ca47ee9022b3daeb8c7df9c0 --> 69a9852570720a3d35cb9dd52a281f71\r\n428ca84f627c695362652cc7531fc27b(repo)\r\n428ca84f627c695362652cc7531fc27b --> 69a9852570720a3d35cb9dd52a281f71\r\n0cd9eb1ffb3c56d2b0a4359f800b1f20(result)\r\n69a9852570720a3d35cb9dd52a281f71 --> 0cd9eb1ffb3c56d2b0a4359f800b1f20\r\nend\r\nsubgraph 8506cba6514466fb6d65f33ace4b0eac[alice_contribute_readme]\r\nstyle 8506cba6514466fb6d65f33ace4b0eac 
fill:#fff4de,stroke:#cece71\r\nd4507d3d1c3fbf3e7e373eae24797667[alice_contribute_readme]\r\n68cf7d6869d027ca46a5fb4dbf7001d1(repo)\r\n68cf7d6869d027ca46a5fb4dbf7001d1 --> d4507d3d1c3fbf3e7e373eae24797667\r\n2f9316539862f119f7c525bf9061e974(result)\r\nd4507d3d1c3fbf3e7e373eae24797667 --> 2f9316539862f119f7c525bf9061e974\r\nend\r\nsubgraph 4233e6dc67cba131d4ef005af9c02959[contribute_readme_md]\r\nstyle 4233e6dc67cba131d4ef005af9c02959 fill:#fff4de,stroke:#cece71\r\n3db0ee5d6ab83886bded5afd86f3f88f[contribute_readme_md]\r\n37044e4d8610abe13849bc71a5cb7591(base)\r\n37044e4d8610abe13849bc71a5cb7591 --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n631c051fe6050ae8f8fc3321ed00802d(commit_message)\r\n631c051fe6050ae8f8fc3321ed00802d --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n182194bab776fc9bc406ed573d621b68(repo)\r\n182194bab776fc9bc406ed573d621b68 --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n0ee9f524d2db12be854fe611fa8126dd(result)\r\n3db0ee5d6ab83886bded5afd86f3f88f --> 0ee9f524d2db12be854fe611fa8126dd\r\nend\r\nsubgraph a6080d9c45eb5f806a47152a18bf7830[create_readme_file_if_not_exists]\r\nstyle a6080d9c45eb5f806a47152a18bf7830 fill:#fff4de,stroke:#cece71\r\n67e388f508dd96084c37d236a2c67e67[create_readme_file_if_not_exists]\r\n54faf20bfdca0e63d07efb3e5a984cf1(readme_contents)\r\n54faf20bfdca0e63d07efb3e5a984cf1 --> 67e388f508dd96084c37d236a2c67e67\r\n8c089c362960ccf181742334a3dccaea(repo)\r\n8c089c362960ccf181742334a3dccaea --> 67e388f508dd96084c37d236a2c67e67\r\n5cc65e17d40e6a7223c1504f1c4b0d2a(result)\r\n67e388f508dd96084c37d236a2c67e67 --> 5cc65e17d40e6a7223c1504f1c4b0d2a\r\nend\r\nsubgraph e7757158127e9845b2915c16a7fa80c5[readme_commit_message]\r\nstyle e7757158127e9845b2915c16a7fa80c5 fill:#fff4de,stroke:#cece71\r\n562bdc535c7cebfc66dba920b1a17540[readme_commit_message]\r\n0af5cbea9050874a0a3cba73bb61f892(issue_url)\r\n0af5cbea9050874a0a3cba73bb61f892 --> 562bdc535c7cebfc66dba920b1a17540\r\n2641f3b67327fb7518ee34a3a40b0755(result)\r\n562bdc535c7cebfc66dba920b1a17540 --> 2641f3b67327fb7518ee34a3a40b0755\r\nend\r\nsubgraph cf99ff6fad80e9c21266b43fd67b2f7b[readme_issue]\r\nstyle cf99ff6fad80e9c21266b43fd67b2f7b fill:#fff4de,stroke:#cece71\r\nda44417f891a945085590baafffc2bdb[readme_issue]\r\nd519830ab4e07ec391038e8581889ac3(body)\r\nd519830ab4e07ec391038e8581889ac3 --> da44417f891a945085590baafffc2bdb\r\n268852aa3fa8ab0864a32abae5a333f7(repo)\r\n268852aa3fa8ab0864a32abae5a333f7 --> da44417f891a945085590baafffc2bdb\r\n77a11dd29af309cf43ed321446c4bf01(title)\r\n77a11dd29af309cf43ed321446c4bf01 --> da44417f891a945085590baafffc2bdb\r\n1d2360c9da18fac0b6ec142df8f3fbda(result)\r\nda44417f891a945085590baafffc2bdb --> 1d2360c9da18fac0b6ec142df8f3fbda\r\nend\r\nsubgraph 7ec0442cf2d95c367912e8abee09b217[readme_pr]\r\nstyle 7ec0442cf2d95c367912e8abee09b217 fill:#fff4de,stroke:#cece71\r\nbb314dc452cde5b6af5ea94dd277ba40[readme_pr]\r\n127d77c3047facc1daa621148c5a0a1d(base)\r\n127d77c3047facc1daa621148c5a0a1d --> bb314dc452cde5b6af5ea94dd277ba40\r\ncb421e4de153cbb912f7fbe57e4ad734(body)\r\ncb421e4de153cbb912f7fbe57e4ad734 --> bb314dc452cde5b6af5ea94dd277ba40\r\ncbf7a0b88c0a41953b245303f3e9a0d3(head)\r\ncbf7a0b88c0a41953b245303f3e9a0d3 --> bb314dc452cde5b6af5ea94dd277ba40\r\ne5f9ad44448abd2469b3fd9831f3d159(origin)\r\ne5f9ad44448abd2469b3fd9831f3d159 --> bb314dc452cde5b6af5ea94dd277ba40\r\na35aee6711d240378eb57a3932537ca1(repo)\r\na35aee6711d240378eb57a3932537ca1 --> bb314dc452cde5b6af5ea94dd277ba40\r\ndfcce88a7d605d46bf17de1159fbe5ad(title)\r\ndfcce88a7d605d46bf17de1159fbe5ad --> 
bb314dc452cde5b6af5ea94dd277ba40\r\na210a7890a7bea8d629368e02da3d806(result)\r\nbb314dc452cde5b6af5ea94dd277ba40 --> a210a7890a7bea8d629368e02da3d806\r\nend\r\nsubgraph 227eabb1f1c5cc0bc931714a03049e27[readme_pr_body]\r\nstyle 227eabb1f1c5cc0bc931714a03049e27 fill:#fff4de,stroke:#cece71\r\n2aea976396cfe68dacd9bc7d4a3f0cba[readme_pr_body]\r\nc5dfd309617c909b852afe0b4ae4a178(readme_issue)\r\nc5dfd309617c909b852afe0b4ae4a178 --> 2aea976396cfe68dacd9bc7d4a3f0cba\r\n40ddb5b508cb5643e7c91f7abdb72b84(result)\r\n2aea976396cfe68dacd9bc7d4a3f0cba --> 40ddb5b508cb5643e7c91f7abdb72b84\r\nend\r\nsubgraph 48687c84e69b3db0acca625cbe2e6b49[readme_pr_title]\r\nstyle 48687c84e69b3db0acca625cbe2e6b49 fill:#fff4de,stroke:#cece71\r\nd8668ff93f41bc241c8c540199cd7453[readme_pr_title]\r\n3b2137dd1c61d0dac7d4e40fd6746cfb(readme_issue)\r\n3b2137dd1c61d0dac7d4e40fd6746cfb --> d8668ff93f41bc241c8c540199cd7453\r\n956e024fde513b3a449eac9ee42d6ab3(result)\r\nd8668ff93f41bc241c8c540199cd7453 --> 956e024fde513b3a449eac9ee42d6ab3\r\nend\r\nsubgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL]\r\nstyle d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71\r\nf577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL]\r\n7440e73a8e8f864097f42162b74f2762(URL)\r\n7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40\r\n8e39b501b41c5d0e4596318f80a03210(valid)\r\nf577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210\r\nend\r\nsubgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo]\r\nstyle af8da22d1318d911f29b95e687f87c5d fill:#fff4de,stroke:#cece71\r\n155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo]\r\need77b9eea541e0c378c67395351099c(URL)\r\need77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n8b5928cd265dd2c44d67d076f60c8b05(ssh_key)\r\n8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n4e1d5ea96e050e46ebf95ebc0713d54c(repo)\r\n155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c\r\n6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL}\r\n6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\nend\r\nsubgraph d3d91578caf34c0ae944b17853783406[git_repo_default_branch]\r\nstyle d3d91578caf34c0ae944b17853783406 fill:#fff4de,stroke:#cece71\r\n546062a96122df465d2631f31df4e9e3[git_repo_default_branch]\r\n181f1b33df4d795fbad2911ec7087e86(repo)\r\n181f1b33df4d795fbad2911ec7087e86 --> 546062a96122df465d2631f31df4e9e3\r\n57651c1bcd24b794dfc8d1794ab556d5(branch)\r\n546062a96122df465d2631f31df4e9e3 --> 57651c1bcd24b794dfc8d1794ab556d5\r\n5ed1ab77e726d7efdcc41e9e2f8039c6(remote)\r\n546062a96122df465d2631f31df4e9e3 --> 5ed1ab77e726d7efdcc41e9e2f8039c6\r\n4c3cdd5f15b7a846d291aac089e8a622{no_git_branch_given}\r\n4c3cdd5f15b7a846d291aac089e8a622 --> 546062a96122df465d2631f31df4e9e3\r\nend\r\nend\r\nsubgraph a4827add25f5c7d5895c5728b74e2beb[Cleanup Stage]\r\nstyle a4827add25f5c7d5895c5728b74e2beb fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph 58ca4d24d2767176f196436c2890b926[Output Stage]\r\nstyle 58ca4d24d2767176f196436c2890b926 fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph inputs[Inputs]\r\nstyle inputs fill:#f6dbf9,stroke:#a178ca\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> e07552ee3b6b7696cb3ddd786222eaad\r\nba29b52e9c5aa88ea1caeeff29bfd491 --> cee6b5fdd0b6fbd0539cdcdc7f5a3324\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> 
330f463830aa97e88917d5a9d1c21500\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> dc7c5f0836f7d2564c402bf956722672\r\nba29b52e9c5aa88ea1caeeff29bfd491 --> 58d8518cb0d6ef6ad35dc242486f1beb\r\n79e1ea6822bff603a835fb8ee80c7ff3 --> e824ae072860bc545fc7d55aa0bca479\r\n135ee61e3402d6fcbd7a219b0b4ccd73 --> e824ae072860bc545fc7d55aa0bca479\r\n40109d487bb9f08608d8c5f6e747042f --> 33d806f9b732bfd6b96ae2e9e4243a68\r\n21ccfd2c550bd853d28581f0b0c9f9fe(seed<br>default.branch.name)\r\n21ccfd2c550bd853d28581f0b0c9f9fe --> fdcb9b6113856222e30e093f7c38065e\r\ndd5aab190ce844673819298c5b8fde76 --> bdcf4b078985f4a390e4ed4beacffa65\r\n9b92d5a346885079a2821c4d27cb5174 --> bdcf4b078985f4a390e4ed4beacffa65\r\n5a5493ab86ab4053f1d44302e7bdddd6 --> ff47cf65b58262acec28507f4427de45\r\n57651c1bcd24b794dfc8d1794ab556d5 --> ff47cf65b58262acec28507f4427de45\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> e58180baf478fe910359358a3fa02234\r\n40109d487bb9f08608d8c5f6e747042f --> c3bfe79b396a98ce2d9bfe772c9c20af\r\n2a1c620b0d510c3d8ed35deda41851c5 --> 4934c6211334318c63a5e91530171c9b\r\n2a1c620b0d510c3d8ed35deda41851c5 --> 5567dd8a6d7ae4fe86252db32e189a4d\r\n5ed1ab77e726d7efdcc41e9e2f8039c6 --> 6c2b36393ffff6be0b4ad333df2d9419\r\ndd5aab190ce844673819298c5b8fde76 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a\r\n9b92d5a346885079a2821c4d27cb5174 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a\r\n0cd9eb1ffb3c56d2b0a4359f800b1f20 --> 1d79010ee1550f057c531130814c40b9\r\ndd5aab190ce844673819298c5b8fde76 --> 712d4318e59bd2dc629f0ddebb257ca3\r\n9b92d5a346885079a2821c4d27cb5174 --> 712d4318e59bd2dc629f0ddebb257ca3\r\ne7ad3469d98c3bd160363dbc47e2d741(seed<br>MetaIssueTitle)\r\ne7ad3469d98c3bd160363dbc47e2d741 --> 38a94f1c2162803f571489d707d61021\r\n150204cd2d5a921deb53c312418379a1 --> 480d1cc478d23858e92d61225349b674\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 37035ea5a06a282bdc1e1de24090a36d\r\n5cc65e17d40e6a7223c1504f1c4b0d2a --> fdf0dbb8ca47ee9022b3daeb8c7df9c0\r\ndd5aab190ce844673819298c5b8fde76 --> 428ca84f627c695362652cc7531fc27b\r\n9b92d5a346885079a2821c4d27cb5174 --> 428ca84f627c695362652cc7531fc27b\r\ndd5aab190ce844673819298c5b8fde76 --> 68cf7d6869d027ca46a5fb4dbf7001d1\r\n9b92d5a346885079a2821c4d27cb5174 --> 68cf7d6869d027ca46a5fb4dbf7001d1\r\n150204cd2d5a921deb53c312418379a1 --> 37044e4d8610abe13849bc71a5cb7591\r\n2641f3b67327fb7518ee34a3a40b0755 --> 631c051fe6050ae8f8fc3321ed00802d\r\n2f9316539862f119f7c525bf9061e974 --> 182194bab776fc9bc406ed573d621b68\r\nd2708225c1f4c95d613a2645a17a5bc0(seed<br>repo.directory.readme.contents)\r\nd2708225c1f4c95d613a2645a17a5bc0 --> 54faf20bfdca0e63d07efb3e5a984cf1\r\n2f9316539862f119f7c525bf9061e974 --> 8c089c362960ccf181742334a3dccaea\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 0af5cbea9050874a0a3cba73bb61f892\r\n1daacccd02f8117e67ad3cb8686a732c(seed<br>ReadmeIssueBody)\r\n1daacccd02f8117e67ad3cb8686a732c --> d519830ab4e07ec391038e8581889ac3\r\n2f9316539862f119f7c525bf9061e974 --> 268852aa3fa8ab0864a32abae5a333f7\r\n0c1ab2d4bda10e1083557833ae5c5da4(seed<br>ReadmeIssueTitle)\r\n0c1ab2d4bda10e1083557833ae5c5da4 --> 77a11dd29af309cf43ed321446c4bf01\r\n150204cd2d5a921deb53c312418379a1 --> 127d77c3047facc1daa621148c5a0a1d\r\n40ddb5b508cb5643e7c91f7abdb72b84 --> cb421e4de153cbb912f7fbe57e4ad734\r\n0ee9f524d2db12be854fe611fa8126dd --> cbf7a0b88c0a41953b245303f3e9a0d3\r\nb4cff8d194413f436d94f9d84ece0262 --> e5f9ad44448abd2469b3fd9831f3d159\r\n2f9316539862f119f7c525bf9061e974 --> a35aee6711d240378eb57a3932537ca1\r\n956e024fde513b3a449eac9ee42d6ab3 --> 
dfcce88a7d605d46bf17de1159fbe5ad\r\n1d2360c9da18fac0b6ec142df8f3fbda --> c5dfd309617c909b852afe0b4ae4a178\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 3b2137dd1c61d0dac7d4e40fd6746cfb\r\n8d0adc31da1a0919724baf73d047743c --> 7440e73a8e8f864097f42162b74f2762\r\n8d0adc31da1a0919724baf73d047743c --> eed77b9eea541e0c378c67395351099c\r\na6ed501edbf561fda49a0a0a3ca310f0(seed<br>git_repo_ssh_key)\r\na6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05\r\n8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 181f1b33df4d795fbad2911ec7087e86\r\nend\r\n```\r\n\r\n- As of f8619a6362251d04929f4bfa395882b3257a3776 it works without meta issue\r\n creation: https://github.com/pdxjohnny/testaaaa/pull/193\r\n\r\n# 45\r\n\r\n```console\r\n$ gif-for-cli --rows $(tput lines) --cols $(tput cols) --export=/mnt/c/Users/Johnny/Downloads/alice-search-alices-adventures-in-wonderland-1.gif \"Alice's Adventures in Wonderland\"\r\n```",
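- The gather mismatch debugged above is easier to see outside of pdb with a minimal standalone sketch (plain Python, not DFFML's real `gather_inputs`; the `Input` shape, the `gather` helper, and the tuple origins are simplified stand-ins): an input is only gathered for a parameter when the origin it was produced under matches one of the origins the operation's input flow lists for that parameter, so a value produced by `OverlayGit:determin_base_branch` never satisfies a flow entry that still says `'seed'`.

```python
# Minimal sketch, assuming simplified shapes: NOT DFFML's real gather_inputs().
# Shows why gather came back as {'base': []} above when the input flow for
# 'base' still said 'seed' but the only matching Input was operation-produced.
from dataclasses import dataclass
from typing import Any, Dict, List


@dataclass
class Input:
    value: Any
    definition: str
    origin: Any  # 'seed' or (instance_name, output_name)


def gather(input_flow: Dict[str, list], available: List[Input]) -> Dict[str, list]:
    # Only keep inputs whose origin is listed for that parameter.
    return {
        name: [item for item in available if item.origin in allowed]
        for name, allowed in input_flow.items()
    }


available = [
    Input("master", "repo.git.base.branch",
          ("OverlayGit:determin_base_branch", "result")),
]

# Flow generated before OverlayGit was applied as an overlay: 'base' expects 'seed'.
print(gather({"base": ["seed"]}, available))
# {'base': []}

# Flow generated with OverlayGit loaded: the expected origin is the producer.
print(gather({"base": [("OverlayGit:determin_base_branch", "result")]}, available))
# {'base': [Input(value='master', definition='repo.git.base.branch', ...)]}
```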
"editedAt": "2022-07-29T05:03:18Z"
},
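- For context on the `### 44` fix above, a minimal sketch, assuming a generic asyncio subprocess wrapper (this is not the real `dffml` util code; `run_command_events` here is a stand-in with a made-up event shape), of the bug class "do not return after yield of stdout/err": returning from the generator right after yielding output means the caller never sees the rest of the stream or the completion event.

```python
# Minimal sketch, generic asyncio only: NOT the actual dffml.util implementation.
import asyncio


async def run_command_events(*cmd):
    proc = await asyncio.create_subprocess_exec(
        *cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT,
    )
    async for line in proc.stdout:
        yield ("stdout", line.decode())
        # Buggy version: a `return` here after the first yield silently drops
        # the rest of the output and the exit status. The fix is to keep
        # draining and only finish once the process has exited.
    yield ("completed", await proc.wait())


async def main():
    async for event, value in run_command_events("echo", "hello"):
        print(event, value)


if __name__ == "__main__":
    asyncio.run(main())
```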
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n### 43\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect the input is being discarded because of a mismatched origin; if not that, will check the definition\r\n - Found out that it was a seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayREADME so that its definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - We found it created an infinite loop\r\n - Will try reusing the redundancy checker, which seems to be doing well\r\n- https://github.com/intel/dffml/issues/1408\r\n- Now debugging why `readme_pr` is not called. OverlayGit definitions were seen earlier\r\n on subflow start to be present, so it must be something else.\r\n - The logs tell us that alice_contribute_readme is returning `None`, which means\r\n that the downstream operation is not called, since None means no return value\r\n in this case.\r\n\r\n```\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n- Future\r\n - `run_custom` Optionally support forward subflow\r\n- TODO\r\n - [ ] Set definition property `AliceGitRepo.lock` to `True`\r\n\r\n\r\n### 44\r\n\r\n- Found out that util: subprocess: run command events: Do not return after yield of stdout/err\r\n - Fixed in b6eea6ed4549f9e7a89aab6306a51213b2bf36c9\r\n\r\n```console\r\n$ (for i in $(echo determin_base_branch readme_pr_body contribute_readme_md github_owns_remote alice_contribute_readme); do grep -rn \"${i} Outputs\" .output/2022-07-28-14-11.txt; done) | sort | uniq | sort\r\n354:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote Outputs: {'result': 
'origin'}\r\n361:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch Outputs: {'result': 'master'}\r\n450:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body Outputs: {'result': 'Closes: https://github.com/pdxjohnny/testaaaa/issues/188'}\r\n472:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}\r\n479:DEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n```\r\n(Pdb)\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}\r\n(Pdb) pprint(readme_dataflow.flow['alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr'].inputs)\r\n{'base': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch': 'result'}],\r\n 'body': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body': 'result'}],\r\n 'head': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md': 'result'}],\r\n 'origin': ['seed'],\r\n 'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}],\r\n 'title': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_title': 'result'}]}\r\n```\r\n\r\n- origin is set to seed\r\n - `'origin': ['seed']` was there because `OverlayGitHub.github_owns_remote` is not in the flow\r\n - We forgot add it to `entry_points.txt`, added\r\n\r\n```console\r\n$ dffml service dev export alice.cli:AlicePleaseContributeCLIDataFlow | tee alice.please.contribute.recommended_community_standards.json\r\n$ (echo -e 'HTTP/1.0 200 OK\\n' && dffml dataflow diagram -shortname alice.please.contribute.recommended_community_standards.json) | nc -Nlp 9999;\r\n```\r\n\r\n- Opens\r\n - `guessed_repo_string_means_no_git_branch_given` is feeding `git_repo_default_branch` but `dffml dataflow diagram` just have a bug because it's not showing the connection.\r\n\r\n```mermaid\r\ngraph TD\r\nsubgraph a759a07029077edc5c37fea0326fa281[Processing Stage]\r\nstyle a759a07029077edc5c37fea0326fa281 fill:#afd388b5,stroke:#a4ca7a\r\nsubgraph 8cfb8cd5b8620de4a7ebe0dfec00771a[cli_has_repos]\r\nstyle 8cfb8cd5b8620de4a7ebe0dfec00771a fill:#fff4de,stroke:#cece71\r\nd493c90433d19f11f33c2d72cd144940[cli_has_repos]\r\ne07552ee3b6b7696cb3ddd786222eaad(cmd)\r\ne07552ee3b6b7696cb3ddd786222eaad --> d493c90433d19f11f33c2d72cd144940\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324(wanted)\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324 --> d493c90433d19f11f33c2d72cd144940\r\n79e1ea6822bff603a835fb8ee80c7ff3(result)\r\nd493c90433d19f11f33c2d72cd144940 --> 
79e1ea6822bff603a835fb8ee80c7ff3\r\nend\r\nsubgraph 0c2b64320fb5666a034794bb2195ecf0[cli_is_asking_for_recommended_community_standards]\r\nstyle 0c2b64320fb5666a034794bb2195ecf0 fill:#fff4de,stroke:#cece71\r\n222ee6c0209f1f1b7a782bc1276868c7[cli_is_asking_for_recommended_community_standards]\r\n330f463830aa97e88917d5a9d1c21500(cmd)\r\n330f463830aa97e88917d5a9d1c21500 --> 222ee6c0209f1f1b7a782bc1276868c7\r\nba29b52e9c5aa88ea1caeeff29bfd491(result)\r\n222ee6c0209f1f1b7a782bc1276868c7 --> ba29b52e9c5aa88ea1caeeff29bfd491\r\nend\r\nsubgraph eac58e8db2b55cb9cc5474aaa402c93e[cli_is_meant_on_this_repo]\r\nstyle eac58e8db2b55cb9cc5474aaa402c93e fill:#fff4de,stroke:#cece71\r\n6c819ad0228b0e7094b33e0634da9a38[cli_is_meant_on_this_repo]\r\ndc7c5f0836f7d2564c402bf956722672(cmd)\r\ndc7c5f0836f7d2564c402bf956722672 --> 6c819ad0228b0e7094b33e0634da9a38\r\n58d8518cb0d6ef6ad35dc242486f1beb(wanted)\r\n58d8518cb0d6ef6ad35dc242486f1beb --> 6c819ad0228b0e7094b33e0634da9a38\r\n135ee61e3402d6fcbd7a219b0b4ccd73(result)\r\n6c819ad0228b0e7094b33e0634da9a38 --> 135ee61e3402d6fcbd7a219b0b4ccd73\r\nend\r\nsubgraph 37887bf260c5c8e9bd18038401008bbc[cli_run_on_repo]\r\nstyle 37887bf260c5c8e9bd18038401008bbc fill:#fff4de,stroke:#cece71\r\n9d1042f33352800e54d98c9c5a4223df[cli_run_on_repo]\r\ne824ae072860bc545fc7d55aa0bca479(repo)\r\ne824ae072860bc545fc7d55aa0bca479 --> 9d1042f33352800e54d98c9c5a4223df\r\n40109d487bb9f08608d8c5f6e747042f(result)\r\n9d1042f33352800e54d98c9c5a4223df --> 40109d487bb9f08608d8c5f6e747042f\r\nend\r\nsubgraph 66ecd0c1f2e08941c443ec9cd89ec589[guess_repo_string_is_directory]\r\nstyle 66ecd0c1f2e08941c443ec9cd89ec589 fill:#fff4de,stroke:#cece71\r\n737d719a0c348ff65456024ddbc530fe[guess_repo_string_is_directory]\r\n33d806f9b732bfd6b96ae2e9e4243a68(repo_string)\r\n33d806f9b732bfd6b96ae2e9e4243a68 --> 737d719a0c348ff65456024ddbc530fe\r\ndd5aab190ce844673819298c5b8fde76(result)\r\n737d719a0c348ff65456024ddbc530fe --> dd5aab190ce844673819298c5b8fde76\r\nend\r\nsubgraph 4ea6696419c4a0862a4f63ea1f60c751[create_branch_if_none_exists]\r\nstyle 4ea6696419c4a0862a4f63ea1f60c751 fill:#fff4de,stroke:#cece71\r\n502369b37882b300d6620d5b4020f5b2[create_branch_if_none_exists]\r\nfdcb9b6113856222e30e093f7c38065e(name)\r\nfdcb9b6113856222e30e093f7c38065e --> 502369b37882b300d6620d5b4020f5b2\r\nbdcf4b078985f4a390e4ed4beacffa65(repo)\r\nbdcf4b078985f4a390e4ed4beacffa65 --> 502369b37882b300d6620d5b4020f5b2\r\n5a5493ab86ab4053f1d44302e7bdddd6(result)\r\n502369b37882b300d6620d5b4020f5b2 --> 5a5493ab86ab4053f1d44302e7bdddd6\r\nend\r\nsubgraph b1d510183f6a4c3fde207a4656c72cb4[determin_base_branch]\r\nstyle b1d510183f6a4c3fde207a4656c72cb4 fill:#fff4de,stroke:#cece71\r\n476aecd4d4d712cda1879feba46ea109[determin_base_branch]\r\nff47cf65b58262acec28507f4427de45(default_branch)\r\nff47cf65b58262acec28507f4427de45 --> 476aecd4d4d712cda1879feba46ea109\r\n150204cd2d5a921deb53c312418379a1(result)\r\n476aecd4d4d712cda1879feba46ea109 --> 150204cd2d5a921deb53c312418379a1\r\nend\r\nsubgraph 2a08ff341f159c170b7fe017eaad2f18[git_repo_to_alice_git_repo]\r\nstyle 2a08ff341f159c170b7fe017eaad2f18 fill:#fff4de,stroke:#cece71\r\n7f74112f6d30c6289caa0a000e87edab[git_repo_to_alice_git_repo]\r\ne58180baf478fe910359358a3fa02234(repo)\r\ne58180baf478fe910359358a3fa02234 --> 7f74112f6d30c6289caa0a000e87edab\r\n9b92d5a346885079a2821c4d27cb5174(result)\r\n7f74112f6d30c6289caa0a000e87edab --> 9b92d5a346885079a2821c4d27cb5174\r\nend\r\nsubgraph b5d35aa8a8dcd28d22d47caad02676b0[guess_repo_string_is_url]\r\nstyle b5d35aa8a8dcd28d22d47caad02676b0 
fill:#fff4de,stroke:#cece71\r\n0de074e71a32e30889b8bb400cf8db9f[guess_repo_string_is_url]\r\nc3bfe79b396a98ce2d9bfe772c9c20af(repo_string)\r\nc3bfe79b396a98ce2d9bfe772c9c20af --> 0de074e71a32e30889b8bb400cf8db9f\r\n2a1c620b0d510c3d8ed35deda41851c5(result)\r\n0de074e71a32e30889b8bb400cf8db9f --> 2a1c620b0d510c3d8ed35deda41851c5\r\nend\r\nsubgraph 60791520c6d124c0bf15e599132b0caf[guessed_repo_string_is_operations_git_url]\r\nstyle 60791520c6d124c0bf15e599132b0caf fill:#fff4de,stroke:#cece71\r\n102f173505d7b546236cdeff191369d4[guessed_repo_string_is_operations_git_url]\r\n4934c6211334318c63a5e91530171c9b(repo_url)\r\n4934c6211334318c63a5e91530171c9b --> 102f173505d7b546236cdeff191369d4\r\n8d0adc31da1a0919724baf73d047743c(result)\r\n102f173505d7b546236cdeff191369d4 --> 8d0adc31da1a0919724baf73d047743c\r\nend\r\nsubgraph f2c7b93622447999daab403713239ada[guessed_repo_string_means_no_git_branch_given]\r\nstyle f2c7b93622447999daab403713239ada fill:#fff4de,stroke:#cece71\r\nc8294a87e7aae8f7f9cb7f53e054fed5[guessed_repo_string_means_no_git_branch_given]\r\n5567dd8a6d7ae4fe86252db32e189a4d(repo_url)\r\n5567dd8a6d7ae4fe86252db32e189a4d --> c8294a87e7aae8f7f9cb7f53e054fed5\r\nd888e6b64b5e3496056088f14dab9894(result)\r\nc8294a87e7aae8f7f9cb7f53e054fed5 --> d888e6b64b5e3496056088f14dab9894\r\nend\r\nsubgraph 113addf4beee5305fdc79d2363608f9d[github_owns_remote]\r\nstyle 113addf4beee5305fdc79d2363608f9d fill:#fff4de,stroke:#cece71\r\n049b72b81b976fbb43607bfeeb0464c5[github_owns_remote]\r\n6c2b36393ffff6be0b4ad333df2d9419(remote)\r\n6c2b36393ffff6be0b4ad333df2d9419 --> 049b72b81b976fbb43607bfeeb0464c5\r\n19a9ee483c1743e6ecf0a2dc3b6f8c7a(repo)\r\n19a9ee483c1743e6ecf0a2dc3b6f8c7a --> 049b72b81b976fbb43607bfeeb0464c5\r\nb4cff8d194413f436d94f9d84ece0262(result)\r\n049b72b81b976fbb43607bfeeb0464c5 --> b4cff8d194413f436d94f9d84ece0262\r\nend\r\nsubgraph 43a22312a3d4f5c995c54c5196acc50a[create_meta_issue]\r\nstyle 43a22312a3d4f5c995c54c5196acc50a fill:#fff4de,stroke:#cece71\r\nd2345f23e5ef9f54c591c4a687c24575[create_meta_issue]\r\n1d79010ee1550f057c531130814c40b9(body)\r\n1d79010ee1550f057c531130814c40b9 --> d2345f23e5ef9f54c591c4a687c24575\r\n712d4318e59bd2dc629f0ddebb257ca3(repo)\r\n712d4318e59bd2dc629f0ddebb257ca3 --> d2345f23e5ef9f54c591c4a687c24575\r\n38a94f1c2162803f571489d707d61021(title)\r\n38a94f1c2162803f571489d707d61021 --> d2345f23e5ef9f54c591c4a687c24575\r\n2b22b4998ac3e6a64d82e0147e71ee1b(result)\r\nd2345f23e5ef9f54c591c4a687c24575 --> 2b22b4998ac3e6a64d82e0147e71ee1b\r\nend\r\nsubgraph f77af509c413b86b6cd7e107cc623c73[meta_issue_body]\r\nstyle f77af509c413b86b6cd7e107cc623c73 fill:#fff4de,stroke:#cece71\r\n69a9852570720a3d35cb9dd52a281f71[meta_issue_body]\r\n480d1cc478d23858e92d61225349b674(base)\r\n480d1cc478d23858e92d61225349b674 --> 69a9852570720a3d35cb9dd52a281f71\r\n37035ea5a06a282bdc1e1de24090a36d(readme_issue)\r\n37035ea5a06a282bdc1e1de24090a36d --> 69a9852570720a3d35cb9dd52a281f71\r\nfdf0dbb8ca47ee9022b3daeb8c7df9c0(readme_path)\r\nfdf0dbb8ca47ee9022b3daeb8c7df9c0 --> 69a9852570720a3d35cb9dd52a281f71\r\n428ca84f627c695362652cc7531fc27b(repo)\r\n428ca84f627c695362652cc7531fc27b --> 69a9852570720a3d35cb9dd52a281f71\r\n0cd9eb1ffb3c56d2b0a4359f800b1f20(result)\r\n69a9852570720a3d35cb9dd52a281f71 --> 0cd9eb1ffb3c56d2b0a4359f800b1f20\r\nend\r\nsubgraph 8506cba6514466fb6d65f33ace4b0eac[alice_contribute_readme]\r\nstyle 8506cba6514466fb6d65f33ace4b0eac 
fill:#fff4de,stroke:#cece71\r\nd4507d3d1c3fbf3e7e373eae24797667[alice_contribute_readme]\r\n68cf7d6869d027ca46a5fb4dbf7001d1(repo)\r\n68cf7d6869d027ca46a5fb4dbf7001d1 --> d4507d3d1c3fbf3e7e373eae24797667\r\n2f9316539862f119f7c525bf9061e974(result)\r\nd4507d3d1c3fbf3e7e373eae24797667 --> 2f9316539862f119f7c525bf9061e974\r\nend\r\nsubgraph 4233e6dc67cba131d4ef005af9c02959[contribute_readme_md]\r\nstyle 4233e6dc67cba131d4ef005af9c02959 fill:#fff4de,stroke:#cece71\r\n3db0ee5d6ab83886bded5afd86f3f88f[contribute_readme_md]\r\n37044e4d8610abe13849bc71a5cb7591(base)\r\n37044e4d8610abe13849bc71a5cb7591 --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n631c051fe6050ae8f8fc3321ed00802d(commit_message)\r\n631c051fe6050ae8f8fc3321ed00802d --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n182194bab776fc9bc406ed573d621b68(repo)\r\n182194bab776fc9bc406ed573d621b68 --> 3db0ee5d6ab83886bded5afd86f3f88f\r\n0ee9f524d2db12be854fe611fa8126dd(result)\r\n3db0ee5d6ab83886bded5afd86f3f88f --> 0ee9f524d2db12be854fe611fa8126dd\r\nend\r\nsubgraph a6080d9c45eb5f806a47152a18bf7830[create_readme_file_if_not_exists]\r\nstyle a6080d9c45eb5f806a47152a18bf7830 fill:#fff4de,stroke:#cece71\r\n67e388f508dd96084c37d236a2c67e67[create_readme_file_if_not_exists]\r\n54faf20bfdca0e63d07efb3e5a984cf1(readme_contents)\r\n54faf20bfdca0e63d07efb3e5a984cf1 --> 67e388f508dd96084c37d236a2c67e67\r\n8c089c362960ccf181742334a3dccaea(repo)\r\n8c089c362960ccf181742334a3dccaea --> 67e388f508dd96084c37d236a2c67e67\r\n5cc65e17d40e6a7223c1504f1c4b0d2a(result)\r\n67e388f508dd96084c37d236a2c67e67 --> 5cc65e17d40e6a7223c1504f1c4b0d2a\r\nend\r\nsubgraph e7757158127e9845b2915c16a7fa80c5[readme_commit_message]\r\nstyle e7757158127e9845b2915c16a7fa80c5 fill:#fff4de,stroke:#cece71\r\n562bdc535c7cebfc66dba920b1a17540[readme_commit_message]\r\n0af5cbea9050874a0a3cba73bb61f892(issue_url)\r\n0af5cbea9050874a0a3cba73bb61f892 --> 562bdc535c7cebfc66dba920b1a17540\r\n2641f3b67327fb7518ee34a3a40b0755(result)\r\n562bdc535c7cebfc66dba920b1a17540 --> 2641f3b67327fb7518ee34a3a40b0755\r\nend\r\nsubgraph cf99ff6fad80e9c21266b43fd67b2f7b[readme_issue]\r\nstyle cf99ff6fad80e9c21266b43fd67b2f7b fill:#fff4de,stroke:#cece71\r\nda44417f891a945085590baafffc2bdb[readme_issue]\r\nd519830ab4e07ec391038e8581889ac3(body)\r\nd519830ab4e07ec391038e8581889ac3 --> da44417f891a945085590baafffc2bdb\r\n268852aa3fa8ab0864a32abae5a333f7(repo)\r\n268852aa3fa8ab0864a32abae5a333f7 --> da44417f891a945085590baafffc2bdb\r\n77a11dd29af309cf43ed321446c4bf01(title)\r\n77a11dd29af309cf43ed321446c4bf01 --> da44417f891a945085590baafffc2bdb\r\n1d2360c9da18fac0b6ec142df8f3fbda(result)\r\nda44417f891a945085590baafffc2bdb --> 1d2360c9da18fac0b6ec142df8f3fbda\r\nend\r\nsubgraph 7ec0442cf2d95c367912e8abee09b217[readme_pr]\r\nstyle 7ec0442cf2d95c367912e8abee09b217 fill:#fff4de,stroke:#cece71\r\nbb314dc452cde5b6af5ea94dd277ba40[readme_pr]\r\n127d77c3047facc1daa621148c5a0a1d(base)\r\n127d77c3047facc1daa621148c5a0a1d --> bb314dc452cde5b6af5ea94dd277ba40\r\ncb421e4de153cbb912f7fbe57e4ad734(body)\r\ncb421e4de153cbb912f7fbe57e4ad734 --> bb314dc452cde5b6af5ea94dd277ba40\r\ncbf7a0b88c0a41953b245303f3e9a0d3(head)\r\ncbf7a0b88c0a41953b245303f3e9a0d3 --> bb314dc452cde5b6af5ea94dd277ba40\r\ne5f9ad44448abd2469b3fd9831f3d159(origin)\r\ne5f9ad44448abd2469b3fd9831f3d159 --> bb314dc452cde5b6af5ea94dd277ba40\r\na35aee6711d240378eb57a3932537ca1(repo)\r\na35aee6711d240378eb57a3932537ca1 --> bb314dc452cde5b6af5ea94dd277ba40\r\ndfcce88a7d605d46bf17de1159fbe5ad(title)\r\ndfcce88a7d605d46bf17de1159fbe5ad --> 
bb314dc452cde5b6af5ea94dd277ba40\r\na210a7890a7bea8d629368e02da3d806(result)\r\nbb314dc452cde5b6af5ea94dd277ba40 --> a210a7890a7bea8d629368e02da3d806\r\nend\r\nsubgraph 227eabb1f1c5cc0bc931714a03049e27[readme_pr_body]\r\nstyle 227eabb1f1c5cc0bc931714a03049e27 fill:#fff4de,stroke:#cece71\r\n2aea976396cfe68dacd9bc7d4a3f0cba[readme_pr_body]\r\nc5dfd309617c909b852afe0b4ae4a178(readme_issue)\r\nc5dfd309617c909b852afe0b4ae4a178 --> 2aea976396cfe68dacd9bc7d4a3f0cba\r\n40ddb5b508cb5643e7c91f7abdb72b84(result)\r\n2aea976396cfe68dacd9bc7d4a3f0cba --> 40ddb5b508cb5643e7c91f7abdb72b84\r\nend\r\nsubgraph 48687c84e69b3db0acca625cbe2e6b49[readme_pr_title]\r\nstyle 48687c84e69b3db0acca625cbe2e6b49 fill:#fff4de,stroke:#cece71\r\nd8668ff93f41bc241c8c540199cd7453[readme_pr_title]\r\n3b2137dd1c61d0dac7d4e40fd6746cfb(readme_issue)\r\n3b2137dd1c61d0dac7d4e40fd6746cfb --> d8668ff93f41bc241c8c540199cd7453\r\n956e024fde513b3a449eac9ee42d6ab3(result)\r\nd8668ff93f41bc241c8c540199cd7453 --> 956e024fde513b3a449eac9ee42d6ab3\r\nend\r\nsubgraph d3ec0ac85209a7256c89d20f758f09f4[check_if_valid_git_repository_URL]\r\nstyle d3ec0ac85209a7256c89d20f758f09f4 fill:#fff4de,stroke:#cece71\r\nf577c71443f6b04596b3fe0511326c40[check_if_valid_git_repository_URL]\r\n7440e73a8e8f864097f42162b74f2762(URL)\r\n7440e73a8e8f864097f42162b74f2762 --> f577c71443f6b04596b3fe0511326c40\r\n8e39b501b41c5d0e4596318f80a03210(valid)\r\nf577c71443f6b04596b3fe0511326c40 --> 8e39b501b41c5d0e4596318f80a03210\r\nend\r\nsubgraph af8da22d1318d911f29b95e687f87c5d[clone_git_repo]\r\nstyle af8da22d1318d911f29b95e687f87c5d fill:#fff4de,stroke:#cece71\r\n155b8fdb5524f6bfd5adbae4940ad8d5[clone_git_repo]\r\need77b9eea541e0c378c67395351099c(URL)\r\need77b9eea541e0c378c67395351099c --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n8b5928cd265dd2c44d67d076f60c8b05(ssh_key)\r\n8b5928cd265dd2c44d67d076f60c8b05 --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\n4e1d5ea96e050e46ebf95ebc0713d54c(repo)\r\n155b8fdb5524f6bfd5adbae4940ad8d5 --> 4e1d5ea96e050e46ebf95ebc0713d54c\r\n6a44de06a4a3518b939b27c790f6cdce{valid_git_repository_URL}\r\n6a44de06a4a3518b939b27c790f6cdce --> 155b8fdb5524f6bfd5adbae4940ad8d5\r\nend\r\nsubgraph d3d91578caf34c0ae944b17853783406[git_repo_default_branch]\r\nstyle d3d91578caf34c0ae944b17853783406 fill:#fff4de,stroke:#cece71\r\n546062a96122df465d2631f31df4e9e3[git_repo_default_branch]\r\n181f1b33df4d795fbad2911ec7087e86(repo)\r\n181f1b33df4d795fbad2911ec7087e86 --> 546062a96122df465d2631f31df4e9e3\r\n57651c1bcd24b794dfc8d1794ab556d5(branch)\r\n546062a96122df465d2631f31df4e9e3 --> 57651c1bcd24b794dfc8d1794ab556d5\r\n5ed1ab77e726d7efdcc41e9e2f8039c6(remote)\r\n546062a96122df465d2631f31df4e9e3 --> 5ed1ab77e726d7efdcc41e9e2f8039c6\r\n4c3cdd5f15b7a846d291aac089e8a622{no_git_branch_given}\r\n4c3cdd5f15b7a846d291aac089e8a622 --> 546062a96122df465d2631f31df4e9e3\r\nend\r\nend\r\nsubgraph a4827add25f5c7d5895c5728b74e2beb[Cleanup Stage]\r\nstyle a4827add25f5c7d5895c5728b74e2beb fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph 58ca4d24d2767176f196436c2890b926[Output Stage]\r\nstyle 58ca4d24d2767176f196436c2890b926 fill:#afd388b5,stroke:#a4ca7a\r\nend\r\nsubgraph inputs[Inputs]\r\nstyle inputs fill:#f6dbf9,stroke:#a178ca\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> e07552ee3b6b7696cb3ddd786222eaad\r\nba29b52e9c5aa88ea1caeeff29bfd491 --> cee6b5fdd0b6fbd0539cdcdc7f5a3324\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> 
330f463830aa97e88917d5a9d1c21500\r\n128516cfa09b0383023eab52ee24878a(seed<br>dffml.util.cli.CMD)\r\n128516cfa09b0383023eab52ee24878a --> dc7c5f0836f7d2564c402bf956722672\r\nba29b52e9c5aa88ea1caeeff29bfd491 --> 58d8518cb0d6ef6ad35dc242486f1beb\r\n79e1ea6822bff603a835fb8ee80c7ff3 --> e824ae072860bc545fc7d55aa0bca479\r\n135ee61e3402d6fcbd7a219b0b4ccd73 --> e824ae072860bc545fc7d55aa0bca479\r\n40109d487bb9f08608d8c5f6e747042f --> 33d806f9b732bfd6b96ae2e9e4243a68\r\n21ccfd2c550bd853d28581f0b0c9f9fe(seed<br>default.branch.name)\r\n21ccfd2c550bd853d28581f0b0c9f9fe --> fdcb9b6113856222e30e093f7c38065e\r\ndd5aab190ce844673819298c5b8fde76 --> bdcf4b078985f4a390e4ed4beacffa65\r\n9b92d5a346885079a2821c4d27cb5174 --> bdcf4b078985f4a390e4ed4beacffa65\r\n5a5493ab86ab4053f1d44302e7bdddd6 --> ff47cf65b58262acec28507f4427de45\r\n57651c1bcd24b794dfc8d1794ab556d5 --> ff47cf65b58262acec28507f4427de45\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> e58180baf478fe910359358a3fa02234\r\n40109d487bb9f08608d8c5f6e747042f --> c3bfe79b396a98ce2d9bfe772c9c20af\r\n2a1c620b0d510c3d8ed35deda41851c5 --> 4934c6211334318c63a5e91530171c9b\r\n2a1c620b0d510c3d8ed35deda41851c5 --> 5567dd8a6d7ae4fe86252db32e189a4d\r\n5ed1ab77e726d7efdcc41e9e2f8039c6 --> 6c2b36393ffff6be0b4ad333df2d9419\r\ndd5aab190ce844673819298c5b8fde76 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a\r\n9b92d5a346885079a2821c4d27cb5174 --> 19a9ee483c1743e6ecf0a2dc3b6f8c7a\r\n0cd9eb1ffb3c56d2b0a4359f800b1f20 --> 1d79010ee1550f057c531130814c40b9\r\ndd5aab190ce844673819298c5b8fde76 --> 712d4318e59bd2dc629f0ddebb257ca3\r\n9b92d5a346885079a2821c4d27cb5174 --> 712d4318e59bd2dc629f0ddebb257ca3\r\ne7ad3469d98c3bd160363dbc47e2d741(seed<br>MetaIssueTitle)\r\ne7ad3469d98c3bd160363dbc47e2d741 --> 38a94f1c2162803f571489d707d61021\r\n150204cd2d5a921deb53c312418379a1 --> 480d1cc478d23858e92d61225349b674\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 37035ea5a06a282bdc1e1de24090a36d\r\n5cc65e17d40e6a7223c1504f1c4b0d2a --> fdf0dbb8ca47ee9022b3daeb8c7df9c0\r\ndd5aab190ce844673819298c5b8fde76 --> 428ca84f627c695362652cc7531fc27b\r\n9b92d5a346885079a2821c4d27cb5174 --> 428ca84f627c695362652cc7531fc27b\r\ndd5aab190ce844673819298c5b8fde76 --> 68cf7d6869d027ca46a5fb4dbf7001d1\r\n9b92d5a346885079a2821c4d27cb5174 --> 68cf7d6869d027ca46a5fb4dbf7001d1\r\n150204cd2d5a921deb53c312418379a1 --> 37044e4d8610abe13849bc71a5cb7591\r\n2641f3b67327fb7518ee34a3a40b0755 --> 631c051fe6050ae8f8fc3321ed00802d\r\n2f9316539862f119f7c525bf9061e974 --> 182194bab776fc9bc406ed573d621b68\r\nd2708225c1f4c95d613a2645a17a5bc0(seed<br>repo.directory.readme.contents)\r\nd2708225c1f4c95d613a2645a17a5bc0 --> 54faf20bfdca0e63d07efb3e5a984cf1\r\n2f9316539862f119f7c525bf9061e974 --> 8c089c362960ccf181742334a3dccaea\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 0af5cbea9050874a0a3cba73bb61f892\r\n1daacccd02f8117e67ad3cb8686a732c(seed<br>ReadmeIssueBody)\r\n1daacccd02f8117e67ad3cb8686a732c --> d519830ab4e07ec391038e8581889ac3\r\n2f9316539862f119f7c525bf9061e974 --> 268852aa3fa8ab0864a32abae5a333f7\r\n0c1ab2d4bda10e1083557833ae5c5da4(seed<br>ReadmeIssueTitle)\r\n0c1ab2d4bda10e1083557833ae5c5da4 --> 77a11dd29af309cf43ed321446c4bf01\r\n150204cd2d5a921deb53c312418379a1 --> 127d77c3047facc1daa621148c5a0a1d\r\n40ddb5b508cb5643e7c91f7abdb72b84 --> cb421e4de153cbb912f7fbe57e4ad734\r\n0ee9f524d2db12be854fe611fa8126dd --> cbf7a0b88c0a41953b245303f3e9a0d3\r\nb4cff8d194413f436d94f9d84ece0262 --> e5f9ad44448abd2469b3fd9831f3d159\r\n2f9316539862f119f7c525bf9061e974 --> a35aee6711d240378eb57a3932537ca1\r\n956e024fde513b3a449eac9ee42d6ab3 --> 
dfcce88a7d605d46bf17de1159fbe5ad\r\n1d2360c9da18fac0b6ec142df8f3fbda --> c5dfd309617c909b852afe0b4ae4a178\r\n1d2360c9da18fac0b6ec142df8f3fbda --> 3b2137dd1c61d0dac7d4e40fd6746cfb\r\n8d0adc31da1a0919724baf73d047743c --> 7440e73a8e8f864097f42162b74f2762\r\n8d0adc31da1a0919724baf73d047743c --> eed77b9eea541e0c378c67395351099c\r\na6ed501edbf561fda49a0a0a3ca310f0(seed<br>git_repo_ssh_key)\r\na6ed501edbf561fda49a0a0a3ca310f0 --> 8b5928cd265dd2c44d67d076f60c8b05\r\n8e39b501b41c5d0e4596318f80a03210 --> 6a44de06a4a3518b939b27c790f6cdce\r\n4e1d5ea96e050e46ebf95ebc0713d54c --> 181f1b33df4d795fbad2911ec7087e86\r\nend\r\n```",
"editedAt": "2022-07-28T21:32:12Z"
},
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n### 43\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect the input is being discarded because of a mismatched origin; if not that, will check the definition\r\n - Found out that it was a seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayREADME so that its definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - We found it created an infinite loop\r\n - Will try reusing the redundancy checker, which seems to be doing well\r\n- https://github.com/intel/dffml/issues/1408\r\n- Now debugging why `readme_pr` is not called. OverlayGit definitions were seen earlier\r\n on subflow start to be present, so it must be something else.\r\n - The logs tell us that alice_contribute_readme is returning `None`, which means\r\n that the downstream operation is not called, since None means no return value\r\n in this case.\r\n\r\n```\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n- Future\r\n - `run_custom` Optionally support forward subflow\r\n- TODO\r\n - [ ] Set definition property `AliceGitRepo.lock` to `True`\r\n\r\n\r\n### 44\r\n\r\n- Found out that util: subprocess: run command events: Do not return after yield of stdout/err\r\n - Fixed in b6eea6ed4549f9e7a89aab6306a51213b2bf36c9\r\n\r\n```\r\n(Pdb)\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}\r\n(Pdb) pprint(readme_dataflow.flow['alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr'].inputs)\r\n{'base': 
[{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch': 'result'}],\r\n 'body': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body': 'result'}],\r\n 'head': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md': 'result'}],\r\n 'origin': ['seed'],\r\n 'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}],\r\n 'title': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_title': 'result'}]}\r\n```\r\n\r\n- origin is set to seed!",
"editedAt": "2022-07-28T21:20:13Z"
},
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n### 43\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition\r\n - Found out that it was seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayReadme to that it's definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - We found it created an infinite loop\r\n - Will try reusing redundancy checker, that seems to be doing well\r\n- https://github.com/intel/dffml/issues/1408\r\n- Now debugging why `readme_pr` not called, OverlayGit definitions were seen earlier\r\n on subflow start to be present, must be something else.\r\n - The logs tell us that alice_contribute_readme is returning `None`, which means\r\n that the downstream operation is not called, since None means no return value\r\n in this case.\r\n\r\n```\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n- Future\r\n - `run_custom` Optionally support forward subflow\r\n- TODO\r\n - [ ] Set definition proprety `AliceGitRepo.lock` to `True`\r\n\r\n\r\n### 44\r\n\r\n- Found out that util: subprocess: run command events: Do not return after yield of stdout/err\r\n - Fixed in b6eea6ed4549f9e7a89aab6306a51213b2bf36c9\r\n\r\n```\r\n(Pdb)\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}\r\n(Pdb) pprint(readme_dataflow.flow['alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr'].inputs)\r\n{'base': 
[{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch': 'result'}],\r\n 'body': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body': 'result'}],\r\n 'head': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md': 'result'}],\r\n 'origin': ['seed'],\r\n 'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}],\r\n 'title': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_title': 'result'}]}\r\n```",
"editedAt": "2022-07-28T21:19:38Z"
},
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n### 43\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition\r\n - Found out that it was seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayReadme to that it's definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - We found it created an infinite loop\r\n - Will try reusing redundancy checker, that seems to be doing well\r\n- https://github.com/intel/dffml/issues/1408\r\n- Now debugging why `readme_pr` not called, OverlayGit definitions were seen earlier\r\n on subflow start to be present, must be something else.\r\n - The logs tell us that alice_contribute_readme is returning `None`, which means\r\n that the downstream operation is not called, since None means no return value\r\n in this case.\r\n\r\n```\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n- Future\r\n - `run_custom` Optionally support forward subflow\r\n- TODO\r\n - [ ] Set definition proprety `AliceGitRepo.lock` to `True`\r\n\r\n\r\n### 44\r\n\r\n- Found out that util: subprocess: run command events: Do not return after yield of stdout/err\r\n - Fixed in b6eea6ed4549f9e7a89aab6306a51213b2bf36c9\r\n\r\n```\r\n(Pdb)\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md Outputs: {'result': 'alice-contribute-recommended-community-standards-readme'}\r\n```",
"editedAt": "2022-07-28T21:11:00Z"
},
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n### 43\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition\r\n - Found out that it was seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayReadme to that it's definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - We found it created an infinite loop\r\n - Will try reusing redundancy checker, that seems to be doing well\r\n- https://github.com/intel/dffml/issues/1408\r\n- Now debugging why `readme_pr` not called, OverlayGit definitions were seen earlier\r\n on subflow start to be present, must be something else.\r\n - The logs tell us that alice_contribute_readme is returning `None`, which means\r\n that the downstream operation is not called, since None means no return value\r\n in this case.\r\n\r\n```\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n- Future\r\n - `run_custom` Optionally support forward subflow\r\n- TODO\r\n - [ ] Set definition proprety `AliceGitRepo.lock` to `True`\r\n\r\n\r\n### 44\r\n\r\n- Found out that util: subprocess: run command events: Do not return after yield of stdout/err\r\n - Fixed in b6eea6ed4549f9e7a89aab6306a51213b2bf36c9",
"editedAt": "2022-07-28T21:07:27Z"
},
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition\r\n - Found out that it was seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayReadme to that it's definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - We found it created an infinite loop\r\n - Will try reusing redundancy checker, that seems to be doing well\r\n- https://github.com/intel/dffml/issues/1408\r\n- Now debugging why `readme_pr` not called, OverlayGit definitions were seen earlier\r\n on subflow start to be present, must be something else.\r\n - The logs tell us that alice_contribute_readme is returning `None`, which means\r\n that the downstream operation is not called, since None means no return value\r\n in this case.\r\n\r\n```\r\nDEBUG:dffml.MemoryOperationImplementationNetworkContext:alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme Outputs: None\r\n```\r\n\r\n- Future\r\n - `run_custom` Optionally support forward subflow\r\n- TODO\r\n - [ ] Set definition proprety `AliceGitRepo.lock` to `True`",
"editedAt": "2022-07-28T20:18:06Z"
},
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition\r\n - Found out that it was seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayReadme so that its definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - We found it created an infinite loop\r\n - Will try reusing the redundancy checker, which seems to be doing well\r\n- https://github.com/intel/dffml/issues/1408\r\n- Future\r\n - `run_custom` Optionally support forward subflow\r\n- TODO\r\n - [ ] Set definition property `AliceGitRepo.lock` to `True`",
"editedAt": "2022-07-28T20:13:57Z"
},
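The debug session recorded above bottoms out in `gather` returning `'base': []` even though a `repo.git.base.branch` Input exists in the context: the InputFlow for `base` only accepts the `seed` origin, while the value on hand came from `('...OverlayGit:determin_base_branch', 'result')`. Below is a minimal sketch of that origin filter in plain Python, not DFFML's actual `gather_inputs`; every name in it is made up for illustration.

```
# Editorial sketch (not the DFFML implementation): why an Input produced by an
# operation is skipped when the InputFlow for that input only expects 'seed'.
from dataclasses import dataclass
from typing import Any, Dict, List, Tuple, Union

Origin = Union[str, Tuple[str, str]]

@dataclass
class ToyInput:
    value: Any
    definition: str
    origin: Origin = "seed"

def toy_gather(
    wanted: Dict[str, List[Origin]],       # input name -> acceptable origins (like InputFlow.inputs)
    available: Dict[str, List[ToyInput]],  # definition name -> Inputs present in the context
    wiring: Dict[str, str],                # input name -> definition name (like Operation.inputs)
) -> Dict[str, List[ToyInput]]:
    gathered: Dict[str, List[ToyInput]] = {}
    for name, origins in wanted.items():
        candidates = available.get(wiring[name], [])
        # Keep only values whose origin is one the flow expects.
        gathered[name] = [i for i in candidates if i.origin in origins]
    return gathered

base_branch = ToyInput(
    value="master",
    definition="repo.git.base.branch",
    origin=("OverlayGit:determin_base_branch", "result"),
)
print(
    toy_gather(
        wanted={"base": ["seed"]},  # auto-wiring defaulted to seed
        available={"repo.git.base.branch": [base_branch]},
        wiring={"base": "repo.git.base.branch"},
    )
)
```

Running it prints `{'base': []}`, mirroring the pdb output above: the operation never receives a complete input set, so it is never dispatched.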
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition\r\n - Found out that it was seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayReadme to that it's definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - We found it created an infinite loop\r\n - Will try reusing redundancy checker\r\n- Future\r\n - `run_custom` Optionally support forward subflow\r\n- TODO\r\n - [ ] Set definition proprety `AliceGitRepo.lock` to `True`",
"editedAt": "2022-07-28T19:57:10Z"
},
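The fix direction noted in the log above, registering OverlayGit as an overlay of OverlayREADME so that its definitions are loaded, changes what `auto_flow` treats as the expected origin: once some operation in the combined flow outputs `repo.git.base.branch`, auto-wiring points `base` at that operation's `('instance_name', 'result')` pair instead of defaulting to `seed`. A toy version of that wiring rule follows; it is a simplified stand-in, not DFFML's `DataFlow.auto_flow`, with only the operation names copied from the logs.

```
# Editorial sketch: map each definition to the operations that produce it, then
# wire every operation input to a producer when one exists, else to 'seed'.
from typing import Dict, List, Tuple, Union

Origin = Union[str, Tuple[str, str]]

def toy_auto_flow(
    operations: Dict[str, Dict[str, Dict[str, str]]],  # name -> {"inputs": {...}, "outputs": {...}}
) -> Dict[str, Dict[str, List[Origin]]]:
    producers: Dict[str, List[Origin]] = {}
    for op_name, op in operations.items():
        for output_name, definition in op["outputs"].items():
            producers.setdefault(definition, []).append((op_name, output_name))
    return {
        op_name: {
            input_name: producers.get(definition, ["seed"])
            for input_name, definition in op["inputs"].items()
        }
        for op_name, op in operations.items()
    }

ops = {
    "OverlayGit:determin_base_branch": {
        "inputs": {},
        "outputs": {"result": "repo.git.base.branch"},
    },
    "OverlayREADME:contribute_readme_md": {
        "inputs": {"base": "repo.git.base.branch"},
        "outputs": {"result": "repo.readme.git.branch"},
    },
}
print(toy_auto_flow(ops)["OverlayREADME:contribute_readme_md"])
# {'base': [('OverlayGit:determin_base_branch', 'result')]}
```

This is also why loading the extra overlay can feed outputs back around as inputs and, as observed above, create an infinite loop that the redundancy checker has to break.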
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition\r\n - Found out that it was seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayReadme to that it's definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - We found it created an infinite loop\r\n - Will try reusing redundancy checker\r\n\r\n- TODO\r\n - `run_custom` Optionally support forward subflow",
"editedAt": "2022-07-28T19:48:23Z"
},
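The locking TODO above (setting `AliceGitRepo.lock` to `True`) amounts to serializing every operation that mutates the same checkout, since the existing lock sits on `git_repository/GitRepoSpec` and the new per-file contribution overlays would otherwise race on one working tree. A rough sketch of per-repo locking with `asyncio` is shown below, as a hypothetical helper rather than DFFML's orchestrator behavior.

```
# Editorial sketch: one asyncio.Lock per repo directory so concurrent
# file-contribution operations on the same checkout run one at a time.
import asyncio
from collections import defaultdict

class PerKeyLocks:
    """Lazily create one asyncio.Lock per key (e.g. per repo directory)."""
    def __init__(self) -> None:
        self._locks = defaultdict(asyncio.Lock)

    def lock(self, key: str) -> asyncio.Lock:
        return self._locks[key]

async def contribute(locks: PerKeyLocks, repo_directory: str, filename: str) -> None:
    async with locks.lock(repo_directory):
        # Only one "contribute <file>" operation touches this checkout at a time.
        await asyncio.sleep(0)  # stand-in for branch / write / commit / push
        print(f"contributed {filename} to {repo_directory}")

async def main() -> None:
    locks = PerKeyLocks()
    await asyncio.gather(
        contribute(locks, "/tmp/dffml-feature-git-68ghk7vd", "README.md"),
        contribute(locks, "/tmp/dffml-feature-git-68ghk7vd", "CONTRIBUTING.md"),
    )

asyncio.run(main())
```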
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition\r\n - Found out that it was seed vs. output origin mismatch\r\n - Found out that BaseBranch comes from OverlayGit\r\n - Registered OverlayGit as an overlay of OverlayReadme to that it's definitions get loaded\r\n - This way `auto_flow` will make the expected origin the output from OverlayGit operations\r\n rather than seed (the default when no matching outputs are seen on DataFlow init).\r\n - \r\n\r\n- TODO\r\n - `run_custom` Optionally support forward subflow",
"editedAt": "2022-07-28T19:47:39Z"
},
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition\r\n\r\n```\r\n 'origin': ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme',\r\n 'result'),\r\n 'origins': [('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme',\r\n 'result')],\r\n```\r\n\r\n- Origins check out\r\n- TODO\r\n - `run_custom` Optionally support forward subflow",
"editedAt": "2022-07-28T19:32:44Z"
},
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition\r\n\r\n```\r\n 'origin': ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme',\r\n 'result'),\r\n 'origins': [('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme',\r\n 'result')],\r\n```\r\n\r\n- Origin's check out",
"editedAt": "2022-07-28T19:25:45Z"
},
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```\r\n\r\n- Suspect discarded because of mismatched origin, if not that, will check definition",
"editedAt": "2022-07-28T19:24:09Z"
},
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- Attempting to figure out why an operation is not being called\r\n - `contribute_readme_md` should be getting `base`, but is not.\r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, 
definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) 
operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n(Pdb) from pprint import pprint\r\n(Pdb) pprint(inputs.definitions)\r\n{ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]}\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n```",
"editedAt": "2022-07-28T19:22:43Z"
},
{
"diff": "## 2022-07-28 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```\r\n\r\n- \r\n\r\n```\r\n{'_': {ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)],\r\n repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)],\r\n repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)]},\r\n 
'alternate_definitions': [],\r\n 'by_origin': {('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)],\r\n ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]},\r\n 'check_for_default_value': [repo.git.base.branch],\r\n 'contexts': [MemoryInputNetworkContextEntry(ctx=Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo), definitions={ReadmeGitRepo: [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], writable.github.remote.origin: [Input(value=origin, definition=writable.github.remote.origin)], repo.git.base.branch: [Input(value=master, definition=repo.git.base.branch)], ReadmePath: [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ReadmeIssue: [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], repo.readme.git.commit.message: [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], github.pr.body: [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]}, by_origin={('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result'): [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGitHub:github_owns_remote', 'result'): [Input(value=origin, definition=writable.github.remote.origin)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayGit:determin_base_branch', 'result'): [Input(value=master, definition=repo.git.base.branch)], 
('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:create_readme_file_if_not_exists', 'result'): [Input(value=/tmp/dffml-feature-git-68ghk7vd/README.md, definition=ReadmePath)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_issue', 'result'): [Input(value=https://github.com/pdxjohnny/testaaaa/issues/108, definition=ReadmeIssue)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message', 'result'): [Input(value=Recommended Community Standard: README\r\n\r\nCloses: https://github.com/pdxjohnny/testaaaa/issues/108\r\n, definition=repo.readme.git.commit.message)], ('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_pr_body', 'result'): [Input(value=Closes: https://github.com/pdxjohnny/testaaaa/issues/108, definition=github.pr.body)]})],\r\n 'ctx': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'dataflow': <dffml.df.types.DataFlow object at 0x7f177ec3b7f0>,\r\n 'definition': repo.git.base.branch,\r\n 'gather': {'base': [],\r\n 'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]},\r\n 'handle_string': \"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', \"\r\n \"URL='https://github.com/pdxjohnny/testaaaa'), \"\r\n 'definition=ReadmeGitRepo)',\r\n 'input_flow': InputFlow(inputs={'repo': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme': 'result'}], 'base': ['seed'], 'commit_message': [{'alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:readme_commit_message': 'result'}]}, conditions=[]),\r\n 'input_name': 'base',\r\n 'input_source': 'seed',\r\n 'input_sources': ['seed'],\r\n 'item': Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo),\r\n 'operation': Operation(name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', inputs={'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}, outputs={'result': repo.readme.git.branch}, stage=<Stage.PROCESSING: 'processing'>, conditions=[], expand=[], instance_name='alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:contribute_readme_md', validator=False, retry=0),\r\n 'origin': 'seed',\r\n 'origins': ['seed'],\r\n 'pprint': <function pprint at 0x7f1782d94670>,\r\n 'rctx': <dffml.df.memory.MemoryRedundancyCheckerContext object at 0x7f177ec5e220>,\r\n 'self': <dffml.df.memory.MemoryInputNetworkContext object at 0x7f177ec3ba30>}\r\n> /home/pdxjohnny/Documents/python/dffml/dffml/df/memory.py(788)gather_inputs()\r\n-> return\r\n(Pdb) gather\r\n{'repo': [Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)], 'base': []}\r\n(Pdb) operation.inputs\r\n{'repo': ReadmeGitRepo, 'base': repo.git.base.branch, 'commit_message': repo.readme.git.commit.message}\r\n(Pdb) 
self.ctxhd.keys()\r\ndict_keys([\"Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-68ghk7vd', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)\"])\r\n```",
"editedAt": "2022-07-28T19:19:22Z"
},
{
"diff": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1\r\n\r\n```\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed.append(dffml.Input(value=repo, definition=definition, origin=('alice.please.contribute.recommended_community_standards.recommended_community_standards.OverlayREADME:alice_contribute_readme', 'result')))\r\n(Pdb) custom_run_dataflow_ctx.config.dataflow.seed\r\n[Input(value=origin, definition=writable.github.remote.origin), Input(value=master, definition=repo.git.base.branch), Input(value=GitRepoSpec(directory='/tmp/dffml-feature-git-hxnacg5_', URL='https://github.com/pdxjohnny/testaaaa'), definition=ReadmeGitRepo)]\r\n```",
"editedAt": "2022-07-28T11:33:03Z"
},
{
"diff": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1",
"editedAt": "2022-07-28T10:30:09Z"
},
{
"diff": "## 2022-07-27 @pdxjohnny Engineering Logs\r\n\r\n- TODO\r\n - [ ] Get involved in SCITT\r\n - [x] Meetings\r\n - https://docs.google.com/document/d/1vf-EliXByhg5HZfgVbTqZhfaJFCmvMdQuZ4tC-Eq6wg/edit#\r\n - Weekly Monday at 8 AM Pacific\r\n - Joining today\r\n - https://armltd.zoom.us/j/99133885299?pwd=b0w4aGorRkpjL3ZHa2NPSmRiNHpXUT09\r\n - [x] Mailing list\r\n - https://www.ietf.org/mailman/listinfo/scitt\r\n - https://mailarchive.ietf.org/arch/browse/scitt/\r\n - [ ] Slack\r\n - https://mailarchive.ietf.org/arch/msg/scitt/PbvoKOX996cNHJEOrjReaNlum64/\r\n - Going to email Orie Steele orie (at) transmute.industries to ask for an invite.\r\n - [x] Kick off OSS scans\r\n - Targeting collaboration with CRob on metrics insertion to OpenSSF DB\r\n - [ ] Finish Q3 plans (Gantt chart, meeting templates, etc.)\r\n - Generate template for auto creation to fill every meeting / fillable pre-meeting\r\n - [ ] Follow up with OneAPI folks\r\n - [ ] Overlay to `alice shouldi contribute` to create git repos when found from forks of PyPi packages\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README`\r\n - [ ] Finish out `alice please contribute recommended community standards`\r\n dynamic opimp for meta issue body creation\r\n - [ ] Associated tutorial\r\n - [ ] Linked from `README` and `CONTRIBUTING`\r\n - [ ] Software Analysis Trinity diagram showing Human Intent, Static Analysis, and Dynamic Analysis to represent the soul of the software / entity and the process taken to improve it.\r\n - [SoftwareAnalysisTrinity.drawio.xml](https://github.com/intel/dffml/files/9190063/SoftwareAnalysisTrinity.drawio.xml.txt)\r\n\r\n### Refactoring and Thinking About Locking of Repos for Contributions\r\n\r\n- Metadata\r\n - Date: 2022-07-27 20:30 UTC -7\r\n- Saving this diff which was some work on dynamic application of overlay\r\n so as to support fixup of the OpImp for `meta_issue_body()`'s inputs.\r\n - We are going to table this for now for time reasons, but if someone\r\n wants to pick it up before @pdxjohnny is back in September, please\r\n give it a go (create an issue).\r\n- Noticed that we have an issue with adding new files and locking. The current\r\n lock is on the `git_repository/GitRepoSpec`.\r\n - We then convert to `AliceGitRepo`, at which point anything take `AliceGitRepo`\r\n - alice: please: contribute: recommended community standards: Refactoring into overlays associated with each file contributed\r\n - Completed in 1a71dbe3ab3743430ce2783f4210a6cd807c36a1",
"editedAt": "2022-07-28T10:29:54Z"
}
]
}
}
]
}
},
{
"body": "# 2022-07-29 Engineering Logs\r\n\r\n- Alice PR: https://github.com/intel/dffml/pull/1401\r\n- John's last day before sabbatical\r\n - He will be in town but offline until 2022-08-29\r\n- Rolling Alice: 2022 Progress Reports: July Activities Recap: https://youtu.be/JDh2DARl8os\r\n- Alice is ready for contribution\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n - Self fulfilling prophecy again! We can even automate our contributions to her even if we wanted to! She will eventually! :P\r\n- IETF\r\n - Joined SCITT WG, will rejoin in September, others please do as well!\r\n- OpenSSF\r\n - Aligned with Identifying Security Threats WG on SCITT looking like a solid direction to cross with Stream 8 for 1-2 year timeframe as Web5 space matures.\r\n- Graphics to help people get involved\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f",
"createdAt": "2022-07-29T12:07:56Z",
"userContentEdits": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpHOABPoBg==",
"hasNextPage": false
},
"totalCount": 12,
"nodes": [
{
"diff": "# 2022-07-29 Engineering Logs\r\n\r\n- Alice PR: https://github.com/intel/dffml/pull/1401\r\n- John's last day before sabbatical\r\n - He will be in town but offline until 2022-08-29\r\n- Rolling Alice: 2022 Progress Reports: July Activities Recap: https://youtu.be/JDh2DARl8os\r\n- Alice is ready for contribution\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n - Self fulfilling prophecy again! We can even automate our contributions to her even if we wanted to! She will eventually! :P\r\n- IETF\r\n - Joined SCITT WG, will rejoin in September, others please do as well!\r\n- OpenSSF\r\n - Aligned with Identifying Security Threats WG on SCITT looking like a solid direction to cross with Stream 8 for 1-2 year timeframe as Web5 space matures.\r\n- Graphics to help people get involved\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f",
"editedAt": "2022-07-29T21:09:30Z"
},
{
"diff": "# 2022-07-29 Engineering Logs\r\n\r\n- Alice PR: https://github.com/intel/dffml/pull/1401\r\n- John's last day before sabbatical\r\n - He will be in town but offline until 2022-08-29\r\n- Rolling Alice: 2022 Progress Reports: July Activities Recap\r\n - https://youtu.be/JDh2DARl8os\r\n- Alice is ready for contribution\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n - Self fulfilling prophecy again! We can even automate our contributions to her even if we wanted to! She will eventually! :P\r\n- IETF\r\n - Joined SCITT WG, will rejoin in September, others please do as well!\r\n- OpenSSF\r\n - Aligned with Identifying Security Threats WG on SCITT looking like a solid direction to cross with Stream 8 for 1-2 year timeframe as Web5 space matures.\r\n- Graphics to help people get involved\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f",
"editedAt": "2022-07-29T21:09:21Z"
},
{
"diff": "# 2022-07-29 Engineering Logs\r\n\r\n- Alice PR: https://github.com/intel/dffml/pull/1401\r\n- John's last day before sabbatical\r\n - He will be in town but offline until 2022-08-29\r\n- Alice is ready for contribution\r\n - https://github.com/intel/dffml/blob/alice/docs/tutorials/rolling_alice/0001_coach_alice/0002_our_open_source_guide.md\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n - Self fulfilling prophecy again! We can even automate our contributions to her even if we wanted to! She will eventually! :P\r\n- IETF\r\n - Joined SCITT WG, will rejoin in September, others please do as well!\r\n- OpenSSF\r\n - Aligned with Identifying Security Threats WG on SCITT looking like a solid direction to cross with Stream 8 for 1-2 year timeframe as Web5 space matures.\r\n- Graphics to help people get involved\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f",
"editedAt": "2022-07-29T19:42:16Z"
},
{
"diff": "# 2022-07-29 Engineering Logs\r\n\r\n- Alice PR: https://github.com/intel/dffml/pull/1401\r\n- John's last day before sabbatical\r\n - He will be in town but offline until 2022-08-29\r\n- Alice is ready for contribution\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n - Self fulfilling prophecy again! We can even automate our contributions to her even if we wanted to! She will eventually! :P\r\n- IETF\r\n - Joined SCITT WG, will rejoin in September, others please do as well!\r\n- OpenSSF\r\n - Aligned with Identifying Security Threats WG on SCITT looking like a solid direction to cross with Stream 8 for 1-2 year timeframe as Web5 space matures.\r\n- Graphics to help people get involved\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f",
"editedAt": "2022-07-29T19:35:44Z"
},
{
"diff": "# 2022-07-29 Engineering Logs\r\n\r\n- John's last day before sabbatical\r\n - He will be in town but offline until 2022-08-29\r\n- Alice is ready for contribution\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n - Self fulfilling prophecy again! We can even automate our contributions to her even if we wanted to! She will eventually! :P\r\n- IETF\r\n - Joined SCITT WG, will rejoin in September, others please do as well!\r\n- OpenSSF\r\n - Aligned with Identifying Security Threats WG on SCITT looking like a solid direction to cross with Stream 8 for 1-2 year timeframe as Web5 space matures.\r\n- Graphics to help people get involved\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f",
"editedAt": "2022-07-29T19:34:08Z"
},
{
"diff": "# 2022-07-29 Engineering Logs\r\n\r\n- Last day before sabbatical\r\n - Will be in town but offline until 2022-08-29\r\n- Alice is ready for contribution\r\n - https://github.com/intel/dffml/blob/alice/entities/alice/CONTRIBUTING.rst\r\n - Self fulfilling prophecy again! We can even automate our contributions to her even if we wanted to! She will eventually! :P\r\n- Graphics to help people get involved\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f",
"editedAt": "2022-07-29T19:28:21Z"
},
{
"diff": "# 2022-07-29 Engineering Logs\r\n\r\n- Last day before sabbatical\r\n - Will be in town but offline until 2022-08-29\r\n- Alice is ready for contribution\r\n - \r\n- Alice is Here\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f\r\n - Graphics to help people get involved",
"editedAt": "2022-07-29T19:26:53Z"
},
{
"diff": "# 2022-07-29 Engineering Logs\r\n\r\n- Alice is Here\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f\r\n - Graphics to help people get involved",
"editedAt": "2022-07-29T12:21:44Z"
},
{
"diff": "# 2022-07-29 Engineering Logs\r\n\r\n- Alice is Here\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f\r\n - Comms materials related to Alice to get people involved",
"editedAt": "2022-07-29T12:21:05Z"
},
{
"diff": "# 2022-07-29 Engineering Logs\r\n\r\n- Alice is Here\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f\r\n - Files related to Alice to get people involved",
"editedAt": "2022-07-29T12:20:51Z"
},
{
"diff": "# 2022-07-29 Engineering Logs\r\n\r\n- Alice is Here\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f\r\n - Files related to Alice",
"editedAt": "2022-07-29T12:20:35Z"
},
{
"diff": "# 2022-07-29 Engineering Logs\r\n\r\n- Alice is Here\r\n - https://drive.google.com/drive/folders/1E8tZT15DNjd13jVR6xqsblgLvwTZZo_f\r\n - Images related to Alice",
"editedAt": "2022-07-29T12:07:56Z"
}
]
},
"replies": {
"pageInfo": {
"endCursor": "Y3Vyc29yOnYyOpK5MjAyMi0wNy0yOVQwNToxNDoyNy0wNzowMM4AMgv3",
"hasNextPage": false
},
"totalCount": 1,
"nodes": [
{
"body": "## 2022-07-29 @pdxjohnny Engineering Logs\r\n\r\n- AppSec PNW 2022 Talk playlist: https://youtube.com/playlist?list=PLfoJYLR9vr_IAd1vYWdKCOO4YYpGFVv99\r\n - John^2: Living Threat Models are Better Than Dead Threat Models\r\n - Not yet uploaded but has Alice's first live demo\r\n- https://towardsdatascience.com/installing-multiple-alternative-versions-of-python-on-ubuntu-20-04-237be5177474\r\n - `$ sudo update-alternatives --install /usr/bin/python python /usr/bin/python3 40`\r\n- References\r\n - https://tenor.com/search/alice-gifs\r\n - https://tenor.com/view/why-thank-you-thanks-bow-thank-you-alice-in-wonderland-gif-3553903\r\n - Alice curtsy\r\n - https://tenor.com/view/alice-in-wonderland-gif-26127117\r\n - Alice blows out unbirthday cake candle\r\n\r\n```console\r\n$ alice; sleep 3; gif-for-cli -l 0 --rows $(tput lines) --cols $(tput cols) 3553903\r\n```\r\n\r\n```console\r\n$ gif-for-cli --rows `tput lines` --cols `tput cols` --export=alice-search-alices-adventures-in-wonderland-1.gif \"Alice curtsy\"\r\n(why-thank-you-thanks-bow-thank-you-alice-in-wonderland-gif-3553903)\r\n$ gif-for-cli --rows `tput lines` --cols `tput cols` --export=ascii-gif-alice-unbirthday-blow-out-candles-0.gif 26127117\r\n$ gif-for-cli --rows `tput lines` --cols `tput cols` ascii-gif-alice-unbirthday-blow-out-candles-0.gif\r\n$ echo gif-for-cli --rows `tput lines` --cols `tput cols`\r\ngif-for-cli --rows 97 --cols 320\r\n$ gif-for-cli -l 0 --rows `tput lines` --cols `tput cols` /mnt/c/Users/Johnny/Downloads/ascii-alices-adventures-in-wonderland-1.gif`\r\n```\r\n\r\n### Exploring a Helper Around Run DataFlow run_custom\r\n\r\n- Realized we already have the lock because it's on `git_repository` at `flow_depth=1`\r\n\r\n```diff\r\ndiff --git a/dffml/df/base.py b/dffml/df/base.py\r\nindex 4f84c1c7c8..2da0512602 100644\r\n--- a/dffml/df/base.py\r\n+++ b/dffml/df/base.py\r\n@@ -404,14 +404,19 @@ def op(\r\n )\r\n \r\n definition_name = \".\".join(name_list)\r\n+ print(\"FEEDFACE\", name, definition_name)\r\n if hasattr(param_annotation, \"__supertype__\") and hasattr(\r\n param_annotation, \"__name__\"\r\n ):\r\n+ if \"repo\" in definition_name:\r\n+ breakpoint()\r\n definition_name = param_annotation.__name__\r\n+ print(\"FEEDFACE\", name, definition_name)\r\n if inspect.isclass(param_annotation) and hasattr(\r\n param_annotation, \"__qualname__\"\r\n ):\r\n definition_name = param_annotation.__qualname__\r\n+ print(\"FEEDFACE\", name, definition_name)\r\n \r\n if isinstance(param_annotation, Definition):\r\n kwargs[\"inputs\"][name] = param_annotation\r\ndiff --git a/dffml/df/types.py b/dffml/df/types.py\r\nindex f09a8a3cea..54840f58c0 100644\r\n--- a/dffml/df/types.py\r\n+++ b/dffml/df/types.py\r\n@@ -44,6 +44,7 @@ APPLY_INSTALLED_OVERLAYS = _APPLY_INSTALLED_OVERLAYS()\r\n \r\n \r\n Expand = Union\r\n+LockReadWrite = Union\r\n \r\n \r\n primitive_types = (int, float, str, bool, dict, list, bytes)\r\n@@ -65,7 +66,7 @@ def find_primitive(new_type: Type) -> Type:\r\n )\r\n \r\n \r\n-def new_type_to_defininition(new_type: Type) -> Type:\r\n+def new_type_to_defininition(new_type: Type, lock: bool = False) -> Type:\r\n \"\"\"\r\n >>> from typing import NewType\r\n >>> from dffml import new_type_to_defininition\r\n@@ -77,6 +78,7 @@ def new_type_to_defininition(new_type: Type) -> Type:\r\n return Definition(\r\n name=new_type.__name__,\r\n primitive=find_primitive(new_type).__qualname__,\r\n+ lock=lock,\r\n links=(\r\n create_definition(\r\n find_primitive(new_type).__qualname__, new_type.__supertype__\r\n@@ 
-95,7 +97,28 @@ class CouldNotDeterminePrimitive(Exception):\r\n \"\"\"\r\n \r\n \r\n-def resolve_if_forward_ref(param_annotation, forward_refs_from_cls):\r\n+DEFAULT_DEFINTION_ANNOTATIONS_HANDLERS = {\r\n+ LockReadWrite: lambda definition: setattr(definition, \"lock\", True),\r\n+}\r\n+\r\n+\r\n+def resolve_if_forward_ref(\r\n+ param_annotation,\r\n+ forward_refs_from_cls,\r\n+ *,\r\n+ defintion_annotations_handlers=None,\r\n+) -> Tuple[Union[\"Definition\", Any], bool]:\r\n+ \"\"\"\r\n+ Return values:\r\n+\r\n+ param_or_definition: Union[Definition, Any]\r\n+ lock: bool\r\n+\r\n+ If the definition should be locked or not.\r\n+ \"\"\"\r\n+ if defintion_annotations_handlers is None:\r\n+ defintion_annotations_handlers = DEFAULT_DEFINTION_ANNOTATIONS_HANDLERS\r\n+ annotations = {}\r\n if isinstance(param_annotation, ForwardRef):\r\n param_annotation = param_annotation.__forward_arg__\r\n if (\r\n@@ -104,11 +127,22 @@ def resolve_if_forward_ref(param_annotation, forward_refs_from_cls):\r\n and hasattr(forward_refs_from_cls, param_annotation)\r\n ):\r\n param_annotation = getattr(forward_refs_from_cls, param_annotation)\r\n+ # Check if are in an annotation\r\n+ param_annotation_origin = get_origin(param_annotation)\r\n+ if param_annotation_origin in defintion_annotations_handlers:\r\n+ annotations[\r\n+ param_annotation_origin\r\n+ ] = defintion_annotations_handlers[param_annotation_origin]\r\n+ param_annotation = list(get_args(param_annotation))[0]\r\n+ # Create definition\r\n if hasattr(param_annotation, \"__name__\") and hasattr(\r\n param_annotation, \"__supertype__\"\r\n ):\r\n # typing.NewType support\r\n- return new_type_to_defininition(param_annotation)\r\n+ definition = new_type_to_defininition(param_annotation)\r\n+ for handler in annotations.values():\r\n+ handler(definition)\r\n+ return definition\r\n return param_annotation\r\n \r\n \r\n@@ -118,6 +152,7 @@ def _create_definition(\r\n default=NO_DEFAULT,\r\n *,\r\n forward_refs_from_cls: Optional[object] = None,\r\n+ lock: bool = False,\r\n ):\r\n param_annotation = resolve_if_forward_ref(\r\n param_annotation, forward_refs_from_cls\r\n@@ -138,12 +173,14 @@ def _create_definition(\r\n elif get_origin(param_annotation) in [\r\n Union,\r\n collections.abc.AsyncIterator,\r\n+ LockReadWrite,\r\n ]:\r\n # If the annotation is of the form Optional\r\n return create_definition(\r\n name,\r\n list(get_args(param_annotation))[0],\r\n forward_refs_from_cls=forward_refs_from_cls,\r\n+ lock=bool(get_origin(param_annotation) in (LockReadWrite,),),\r\n )\r\n elif (\r\n get_origin(param_annotation) is list\r\n@@ -235,6 +272,7 @@ def create_definition(\r\n default=NO_DEFAULT,\r\n *,\r\n forward_refs_from_cls: Optional[object] = None,\r\n+ lock: bool = False,\r\n ):\r\n if hasattr(param_annotation, \"__name__\") and hasattr(\r\n param_annotation, \"__supertype__\"\r\n@@ -246,6 +284,7 @@ def create_definition(\r\n param_annotation,\r\n default=default,\r\n forward_refs_from_cls=forward_refs_from_cls,\r\n+ lock=lock,\r\n )\r\n # We can guess name if converting from NewType. 
However, we can't otherwise.\r\n if not definition.name:\r\n@@ -847,7 +886,9 @@ class DataFlow:\r\n for operation in args:\r\n name = getattr(getattr(operation, \"op\", operation), \"name\")\r\n if name in operations:\r\n- raise ValueError(f\"Operation {name} given as positional and in dict\")\r\n+ raise ValueError(\r\n+ f\"Operation {name} given as positional and in dict\"\r\n+ )\r\n operations[name] = operation\r\n \r\n self.operations = operations\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\nindex 825f949d65..0ff7e11c31 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\n@@ -8,18 +8,21 @@ import dffml\r\n import dffml_feature_git.feature.definitions\r\n \r\n \r\n-class AliceGitRepo(NamedTuple):\r\n+class AliceGitRepoSpec(NamedTuple):\r\n directory: str\r\n URL: str\r\n \r\n \r\n+AliceGitRepo = dffml.LockReadWrite[AliceGitRepoSpec]\r\n+\r\n+\r\n class AliceGitRepoInputSetContextHandle(dffml.BaseContextHandle):\r\n def as_string(self) -> str:\r\n return str(self.ctx.repo)\r\n \r\n \r\n class AliceGitRepoInputSetContext(dffml.BaseInputSetContext):\r\n- def __init__(self, repo: AliceGitRepo):\r\n+ def __init__(self, repo: AliceGitRepoSpec):\r\n self.repo = repo\r\n \r\n async def handle(self) -> AliceGitRepoInputSetContextHandle:\r\n```\r\n\r\n- Is this the same as what we had in c89d3d8444cdad248fce5a7fff959c9ea48a7c9d ?\r\n\r\n```python\r\n async def alice_contribute_readme(self, repo: AliceGitRepo) -> ReadmeGitRepo:\r\n key, definition = list(self.parent.op.outputs.items())[0]\r\n await self.octx.ictx.cadd(\r\n AliceGitRepoInputSetContext(repo),\r\n dffml.Input(\r\n value=repo,\r\n definition=definition,\r\n parents=None,\r\n origin=(self.parent.op.instance_name, key),\r\n )\r\n )\r\n```\r\n\r\n```diff\r\ndiff --git a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\nindex 825f949d65..1bc1c41e50 100644\r\n--- a/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\n+++ b/entities/alice/alice/please/contribute/recommended_community_standards/recommended_community_standards.py\r\n@@ -203,30 +203,22 @@ class OverlayREADME:\r\n ReadmePRBody = NewType(\"github.pr.body\", str)\r\n\r\n # async def cli_run_on_repo(self, repo: \"CLIRunOnRepo\"):\r\n- async def alice_contribute_readme(self, repo: AliceGitRepo) -> ReadmeGitRepo:\r\n- # TODO Clean this up once SystemContext refactor complete\r\n- readme_dataflow_cls_upstream = OverlayREADME\r\n- readme_dataflow_cls_overlays = dffml.Overlay.load(\r\n- entrypoint=\"dffml.overlays.alice.please.contribute.recommended_community_standards.overlay.readme\"\r\n- )\r\n- readme_dataflow_upstream = dffml.DataFlow(\r\n- *dffml.object_to_operations(readme_dataflow_cls_upstream)\r\n- )\r\n+ async def new_context(self, repo: AliceGitRepo) -> ReadmeGitRepo:\r\n+ return\r\n # auto_flow with overlays\r\n- readme_dataflow = dffml.DataFlow(\r\n+ dataflow = dffml.DataFlow(\r\n *itertools.chain(\r\n *[\r\n dffml.object_to_operations(cls)\r\n for cls in [\r\n- 
readme_dataflow_cls_upstream,\r\n- *readme_dataflow_cls_overlays,\r\n+ upstream,\r\n+ *overlays,\r\n ]\r\n ]\r\n )\r\n )\r\n async with dffml.run_dataflow.imp(\r\n- # dataflow=self.octx.config.dataflow,\r\n- dataflow=readme_dataflow,\r\n+ dataflow=dataflow,\r\n input_set_context_cls=AliceGitRepoInputSetContext,\r\n ) as custom_run_dataflow:\r\n # Copy all inputs from parent context into child. We eventually\r\n@@ -277,6 +269,18 @@ class OverlayREADME:\r\n },\r\n )\r\n\r\n+ async def alice_contribute_readme(self, repo: AliceGitRepo) -> ReadmeGitRepo:\r\n+ key, definition = list(self.parent.op.outputs.items())[0]\r\n+ await self.octx.ictx.cadd(\r\n+ AliceGitRepoInputSetContext(repo),\r\n+ dffml.Input(\r\n+ value=repo,\r\n+ definition=definition,\r\n+ parents=None,\r\n+ origin=(self.parent.op.instance_name, key),\r\n+ )\r\n+ )\r\n+\r\n # TODO Run this system context where readme contexts is given on CLI or\r\n # overriden via disabling of static overlay and application of overlay to\r\n # generate contents dynamiclly.\r\n```\r\n\r\n- Visualize the flow before we attempt to add `CONTRIBUTING.md` contribution\r\n\r\n```console\r\n$ dffml service dev export alice.cli:AlicePleaseContributeCLIDataFlow | tee alice.please.contribute.recommended_community_standards.json\r\n$ (echo -e 'HTTP/1.0 200 OK\\n' && dffml dataflow diagram -shortname alice.please.contribute.recommended_community_standards.json) | nc -Nlp 9999;\r\n```\r\n\r\n```mermaid\r\ngraph TD\r\nsubgraph a759a07029077edc5c37fea0326fa281[Processing Stage]\r\nstyle a759a07029077edc5c37fea0326fa281 fill:#afd388b5,stroke:#a4ca7a\r\nsubgraph 8cfb8cd5b8620de4a7ebe0dfec00771a[cli_has_repos]\r\nstyle 8cfb8cd5b8620de4a7ebe0dfec00771a fill:#fff4de,stroke:#cece71\r\nd493c90433d19f11f33c2d72cd144940[cli_has_repos]\r\ne07552ee3b6b7696cb3ddd786222eaad(cmd)\r\ne07552ee3b6b7696cb3ddd786222eaad --> d493c90433d19f11f33c2d72cd144940\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324(wanted)\r\ncee6b5fdd0b6fbd0539cdcdc7f5a3324 --> d493c90433d19f11f33c2d72cd144940\r\n79e1ea6822bff603a835fb8ee80c7ff3(result)\r\nd493c90433d19f11f33c2d72cd144940 --> 79e1ea6822bff603a835fb8ee80c7ff3\r\nend\r\nsubgraph 0c2b64320fb5666a034794bb2195ecf0[cli_is_asking_for_recommended_community_standards]\r\nstyle 0c2b64320fb5666a034794bb2195ecf0 fill:#fff4de,stroke:#cece71\r\n222ee6c0209f1f1b7a782bc1276868c7[cli_is_asking_for_recommended_community_standards]\r\n330f463830aa97e88917d5a9d1c21500(cmd)\r\n330f463830aa97e88917d5a9d1c21500 --> 222ee6c0209f1f1b7a782bc1276868c7\r\nba29b52e9c5aa88ea1caeeff29bfd491(result)\r\n222ee6c0209f1f1b7a782bc1276868c7 --> ba29b52e9c5aa88ea1caeeff29bfd491\r\nend\r\nsubgraph eac58e8db2b55cb9cc5474aaa402c93e[cli_is_meant_on_this_repo]\r\nstyle eac58e8db2b55cb9cc5474aaa402c93e fill:#fff4de,stroke:#cece71\r\n6c819ad0228b0e7094b33e0634da9a38[cli_is_meant_on_this_repo]\r\ndc7c5f0836f7d2564c402bf956722672(cmd)\r\ndc7c5f0836f7d2564c402bf956722672 --> 6c819ad0228b0e7094b33e0634da9a38\r\n58d8518cb0d6ef6ad35dc242486f1beb(wanted)\r\n58d8518cb0d6ef6ad35dc242486f1beb --> 6c819ad0228b0e7094b33e0634da9a38\r\n135ee61e3402d6fcbd7a219b0b4ccd73(result)\r\n6c819ad0228b0e7094b33e0634da9a38 --> 135ee61e3402d6fcbd7a219b0b4ccd73\r\nend\r\nsubgraph 37887bf260c5c8e9bd18038401008bbc[cli_run_on_repo]\r\nstyle 37887bf260c5c8e9bd18038401008bbc fill:#fff4de,stroke:#cece71\r\n9d1042f33352800e54d98c9c5a4223df[cli_run_on_repo]\r\ne824ae072860bc545fc7d55aa0bca479(repo)\r\ne824ae072860bc545fc7d55aa0bca479 --> 
9d1042f33352800e54d98c9c5a4223df\r\n40109d487bb9f08608d8c5f6e747042f(result)\r\n9d1042f33352800e54d98c9c5a4223df --> 40109d487bb9f08608d8c5f6e747042f\r\nend\r\nsubgraph 66ecd0c1f2e08941c443ec9cd89ec589[guess_repo_string_is_directory]\r\nstyle 66ecd0c1f2e08941c443ec9cd89ec589 fill:#fff4de,stroke:#cece71\r\n737d719a0c348ff65456024ddbc530fe[guess_repo_string_is_directory]\r\n33d806f9b732bfd6b96ae2e9e4243a68(repo_string)\r\n33d806f9b732bfd6b96ae2e9e4243a68 --> 737d719a0c348ff65456024ddbc530fe\r\ndd5aab190ce844673819298c5b8fde76(result)\r\n737d719a0c348ff65456024ddbc530fe --> dd5aab190ce844673819298c5b8fde76\r\nend\r\nsubgraph 2bcd191634373f4b97ecb9546df23ee5[alice_contribute_contributing]\r\nstyle 2bcd191634373f4b97ecb9546df23ee5 fill:#fff4de,stroke:#cece71\r\na2541ce40b2e5453e8e919021011e5e4[alice_contribute_contributing]\r\n3786b4af914402320d260d077844620e(repo)\r\n3786b4af914402320d260d077844620e --> a2541ce40b2e5453e8e919021011e5e4\r\nda4270ecc44b6d9eed9809a560d24a28(result)\r\na2541ce40b2e5453e8e919021011e5e4 --> da4270ecc44b6d9eed9809a560d24a28\r\nend\r\nsubgraph 13b430e6b93de7e40957165687f8e593[contribute_contributing_md]\r\nstyle 13b430e6b93de7e40957165687f8e593 fill:#fff4de,stroke:#cece71\r\nff8f8968322872ccc3cf151d167e22a2[contribute_contributing_md]\r\n4f752ce18209f62ed749e88dd1f70266(base)\r\n4f752ce18209f62ed749e88dd1f70266 --> ff8f8968322872ccc3cf151d167e22a2\r\n2def8c6923c832adf33989b26c91295a(commit_message)\r\n2def8c6923c832adf33989b26c91295a --> ff8f8968322872ccc3cf151d167e22a2\r\nf5548fcbcec8745ddf04104fc78e83a3(repo)\r\nf5548fcbcec8745ddf04104fc78e83a3 --> ff8f8968322872ccc3cf151d167e22a2\r\n24292ae12efd27a227a0d6368ba01faa(result)\r\nff8f8968322872ccc3cf151d167e22a2 --> 24292ae12efd27a227a0d6368ba01faa\r\nend\r\nsubgraph 71a5f33f393735fa1cc91419b43db115[contributing_commit_message]\r\nstyle 71a5f33f393735fa1cc91419b43db115 fill:#fff4de,stroke:#cece71\r\nd034a42488583464e601bcaee619a539[contributing_commit_message]\r\nc0a0fa68a872adf890ed639e07ed5882(issue_url)\r\nc0a0fa68a872adf890ed639e07ed5882 --> d034a42488583464e601bcaee619a539\r\nce14ca2191f2b1c13c605b240e797255(result)\r\nd034a42488583464e601bcaee619a539 --> ce14ca2191f2b1c13c605b240e797255\r\nend\r\nsubgraph db8a1253cc59982323848f5e42c23c9d[contributing_issue]\r\nstyle db8a1253cc59982323848f5e42c23c9d fill:#fff4de,stroke:#cece71\r\nc39bd2cc88723432048c434fdd337eab[contributing_issue]\r\n821d21e8a69d1fa1757147e7e768f306(body)\r\n821d21e8a69d1fa1757147e7e768f306 --> c39bd2cc88723432048c434fdd337eab\r\n0581b90c76b0a4635a968682b060abff(repo)\r\n0581b90c76b0a4635a968682b060abff --> c39bd2cc88723432048c434fdd337eab\r\n809719538467f6d0bf18f7ae26f08d80(title)\r\n809719538467f6d0bf18f7ae26f08d80 --> c39bd2cc88723432048c434fdd337eab\r\nc9f2ea5a7f25b3ae9fbf5041be5fa071(result)\r\nc39bd2cc88723432048c434fdd337eab --> c9f2ea5a7f25b3ae9fbf5041be5fa071\r\nend\r\nsubgraph 1e6046d1a567bf390566b1b995df9dcf[contributing_pr]\r\nstyle 1e6046d1a567bf390566b1b995df9dcf fill:#fff4de,stroke:#cece71\r\n4ec1433342f2f12ab8c59efab20e7b06[contributing_pr]\r\nbb85c3467b05192c99a3954968c7a612(base)\r\nbb85c3467b05192c99a3954968c7a612 --> 4ec1433342f2f12ab8c59efab20e7b06\r\n77f6c1c6b7ee62881b49c289097dfbde(body)\r\n77f6c1c6b7ee62881b49c289097dfbde --> 4ec1433342f2f12ab8c59efab20e7b06\r\na0a2fabc65fe5601c7ea289124d04f70(head)\r\na0a2fabc65fe5601c7ea289124d04f70 --> 4ec1433342f2f12ab8c59efab20e7b06\r\ncf92708915b9f41cb490b991abd6c374(origin)\r\ncf92708915b9f41cb490b991abd6c374 --> 
4ec1433342f2f12ab8c59efab20e7b06\r\n210ae36c85f3597c248e0b32da7661ae(repo)\r\n210ae36c85f3597c248e0b32da7661ae --> 4ec1433342f2f12ab8c59efab20e7b06\r\n1700dc637c25bd503077a2a1422142e2(title)\r\n1700dc637c25bd503077a2a1422142e2 --> 4ec1433342f2f12ab8c59efab20e7b06\r\n806e8c455d2bb7ad68112d2a7e16eed6(result)\r\n4ec1433342f2f12ab8c59efab20e7b06 --> 806e8c455d2bb7ad68112d2a7e16eed6\r\nend\r\nsubgraph 04c27c13241164ae88456c1377995897[contributing_pr_body]\r\nstyle 04c27c13241164ae88456c1377995897 fill:#fff4de,stroke:#cece71\r\na3cebe78451142664930d44ad4d7d181[contributing_pr_body]\r\n6118470d0158ef1a220fe7c7232e1b63(contributing_issue)\r\n6118470d0158ef1a220fe7c7232e1b63 --> a3cebe78451142664930d44ad4d7d181\r\n99a7dd1ae037153eef80e1dee51b9d2b(result)\r\na3cebe78451142664930d44ad4d7d181 --> 99a7dd1ae037153eef80e1dee51b9d2b\r\nend\r\nsubgraph 0d4627f8d8564b6c4ba33c12dcb58fc1[contributing_pr_title]\r\nstyle 0d4627f8d8564b6c4ba33c12dcb58fc1 fill:#fff4de,stroke:#cece71\r\nbfa172a9399604546048d60db0a36187[contributing_pr_title]\r\n0fd26f9166ccca10c68e9aefa9c15767(contributing_issue)\r\n0fd26f9166ccca10c68e9aefa9c15767 --> bfa172a9399604546048d60db0a36187\r\n77a2f9d4dfad5f520f1502e8ba70e47a(result)\r\nbfa172a9399604546048d60db0a36187 --> 77a2f9d4dfad5f520f1502e8ba70e47a\r\nend\r\nsubgraph c67b92ef6a2e025ca086bc2f89d9afbb[create_contributing_file_if_not_exists]\r\nstyle c67b92ef6a2e025ca086bc2f89d9afbb fill:#fff4de,stroke:#cece71\r\n993a1fe069a02a45ba3579b1902b2a36[create_contributing_file_if_not_exists]\r\n401c179bb30b24c2ca989c64d0b1cdc7(contributing_contents)\r\n401c179bb30b24c2ca989c64d0b1cdc7 --> 993a1fe069a02a45ba3579b1902b2a36\r\ndde78f81b1bdfe02c0a2bf6e51f65cb4(repo)\r\ndde78f81b1bdfe02c0a2bf6e51f65cb4 --> 993a1fe069a02a45ba3579b1902b2a36\r\ne5b8d158dc0ec476dbbd44549a981815(result)\r\n993a1fe069a02a45ba3579b1902b2a36 --> e5b8d158dc0ec476dbbd44549a981815\r\nend\r\nsubgraph 4ea6696419c4a0862a4f63ea1f60c751[create_branch_if_none_exists]\r\nstyle 4ea6696419c4a0862a4f63ea1f60c751 fill:#fff4de,stroke:#cece71\r\n502369b37882b300d6620d5b4020f5b2[create_branch_if_none_exists]\r\nfdcb9b6113856222e30e093f7c38065e(name)\r\nfdcb9b6113856222e30e093f7c38065e --> 502369b37882b300d6620d5b4020f5b2\r\nbdcf4b078985f4a390e4ed4beacffa65(repo)\r\nbdcf4b078985f4a390e4ed4beacffa65 --> 502369b37882b300d6620d5b4020f5b2\r\n5a5493ab86ab4053f1d44302e7bdddd6(result)\r\n502369b37882b300d6620d5b4020f5b2 --> 5a5493ab86ab4053f1d44302e7bdddd6\r\nend\r\nsubgraph b1d510183f6a4c3fde207a4656c72cb4[determin_base_branch]\r\nstyle b1d510183f6a4c3fde207a4656c72cb4 fill:#fff4de,stroke:#cece71\r\n476aecd4d4d712cda1879feba46ea109[determin_base_branch]\r\nff47cf65b58262acec28507f4427de45(default_branch)\r\nff47cf65b58262acec28507f4427de45 --> 476aecd4d4d712cda1879feba46ea109\r\n150204cd2d5a921deb53c312418379a1(result)\r\n476aecd4d4d712cda1879feba46ea109 --> 150204cd2d5a921deb53c312418379a1\r\nend\r\nsubgraph 2a08ff341f159c170b7fe017eaad2f18[git_repo_to_alice_git_repo]\r\nstyle 2a08ff341f159c170b7fe017eaad2f18 fill:#fff4de,stroke:#cece71\r\n7f74112f6d30c6289caa0a000e87edab[git_repo_to_alice_git_repo]\r\ne58180baf478fe910359358a3fa02234(repo)\r\ne58180baf478fe910359358a3fa02234 --> 7f74112f6d30c6289caa0a000e87edab\r\n9b92d5a346885079a2821c4d27cb5174(result)\r\n7f74112f6d30c6289caa0a000e87edab --> 9b92d5a346885079a2821c4d27cb5174\r\nend\r\nsubgraph b5d35aa8a8dcd28d22d47caad02676b0[guess_repo_string_is_url]\r\nstyle b5d35aa8a8dcd28d22d47caad02676b0 
fill:#fff4de,stroke:#cece71\r\n0de074e71a32e30889b8bb400cf8db9f[guess_repo_string_is_url]\r\nc3bfe79b396a98ce2d9bfe772c9c20af(repo_string)\r\nc3bfe79b396a98ce2d9bfe772c9c20af --> 0de074e71a32e30889b8bb400cf8db9f\r\n2a1c620b0d510c3d8ed35deda41851c5(result)\r\n0de074e71a32e30889b8bb400cf8db9f --> 2a1c620b0d510c3d8ed35deda41851c5\r\nend\r\nsubgraph 60791520c6d124c0bf15e599132b0caf[guessed_repo_string_is_operations_git_url]\r\nstyle 60791520c6d124c0bf15e599132b0caf fill:#fff4de,stroke:#cece71\r\n102f173505d7b546236cdeff191369d4[guessed_repo_string_is_operations_git_url]\r\n4934c6211334318c63a5e91530171c9b(repo_url)\r\n4934c6211334318c63a5e91530171c9b --> 102f173505d7b546236cdeff191369d4\r\n8d0adc31da1a0919724baf73d047743c(result)\r\n102f173505d7b546236cdeff191369d4 --> 8d0adc31da1a0919724baf73d047743c\r\nend\r\nsubgraph f2c7b93622447999daab403713239ada[guessed_repo_string_means_no_git_branch_given]\r\nstyle f2c7b93622447999daab403713239ada fill:#fff4de,stroke:#cece71\r\nc8294a87e7aae8f7f9cb7f53e054fed5