
[REVIEW]: Learning Machine Learning with Lorenz-96 #241

Open
editorialbot opened this issue Mar 27, 2024 · 85 comments
Labels: Jupyter Notebook, Python, recommend-accept (Papers recommended for acceptance in JOSE), review, TeX

@editorialbot
Collaborator

editorialbot commented Mar 27, 2024

Submitting author: @dhruvbalwada (Dhruv Balwada)
Repository: https://github.com/m2lines/L96_demo
Branch with paper.md (empty if default branch):
Version: v1.0.3
Editor: @magsol
Reviewers: @Micky774, @AnonymousFool
Archive: 10.5281/zenodo.13357587
Paper kind: learning module

Status


Status badge code:

HTML: <a href="https://jose.theoj.org/papers/c644a0264f445698f212a051d8ace6e8"><img src="https://jose.theoj.org/papers/c644a0264f445698f212a051d8ace6e8/status.svg"></a>
Markdown: [![status](https://jose.theoj.org/papers/c644a0264f445698f212a051d8ace6e8/status.svg)](https://jose.theoj.org/papers/c644a0264f445698f212a051d8ace6e8)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@Micky774 & @AnonymousFool, your review will be checklist based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://openjournals.readthedocs.io/en/jose/reviewer_guidelines.html. Any questions/concerns please let @magsol know.

Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest.

Checklists

📝 Checklist for @Micky774

📝 Checklist for @AnonymousFool

@editorialbot
Collaborator Author

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot
Collaborator Author

Software report:

github.com/AlDanial/cloc v 1.90  T=0.08 s (584.4 files/s, 331163.2 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Jupyter Notebook                28              0          16704           7734
Python                           6            405            581           1335
TeX                              2             37              1            388
Markdown                         6             71              0            290
YAML                             5             10             27            183
SVG                              2              0              0              2
-------------------------------------------------------------------------------
SUM:                            49            523          17313           9932
-------------------------------------------------------------------------------

Commit count by author:

    58	Shubham Gupta
    57	Alistair Adcroft
    45	Ryan Abernathey
    29	Shantanu Acharya
    25	pre-commit-ci[bot]
    23	dhruvbalwada
    20	Dhruv Balwada
    17	Mohamed Aziz Bhouri
    16	Johanna Goldman
    14	Laure Zanna
     9	Brandon Reichl
     7	Feiyu Lu
     7	Yani Yoval
     5	Nora Loose
     5	Pierre Gentine
     4	lesommer
     3	Andrew Ross
     3	Arthur
     3	Lorenzo Zampieri
     3	Ziwei Li
     2	Mitch Bushuk
     2	Sara Shamekh
     1	Alex Connolly
     1	William-gregory
     1	chzhangudel

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- None

MISSING DOIs

- 10.1017/cbo9780511617652.004 may be a valid DOI for title: Predictability: a problem partly solved

INVALID DOIs

- None
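
For context, resolving a MISSING DOI usually just means adding a doi field to the matching entry in the paper's .bib file. A minimal sketch of that edit, in which the entry key, entry type, and author field are illustrative (only the title and the suggested DOI come from the report above) and the remaining fields are omitted:

@incollection{lorenz_predictability,
  author = {Lorenz, Edward N.},
  title  = {Predictability: a problem partly solved},
  doi    = {10.1017/cbo9780511617652.004}
}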

@editorialbot
Collaborator Author

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@editorialbot
Collaborator Author

Paper file info:

📄 Wordcount for paper.md is 1350

🔴 Failed to discover a Statement of need section in paper

@editorialbot
Collaborator Author

License info:

✅ License found: MIT License (Valid open source OSI approved license)

@Micky774

Micky774 commented Mar 27, 2024

Review checklist for @Micky774

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source for this learning module available at https://github.com/m2lines/L96_demo?
  • License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • Version: Does the release version given match the repository release?
  • Authorship: Has the submitting author (@dhruvbalwada) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • Installation instructions: Is there a clearly stated list of dependencies?
  • Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • Description: Does the paper describe the learning materials and sequence?
  • Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • Could someone else teach with this module, given the right expertise?
  • Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

@magsol

magsol commented Apr 15, 2024

Hey @Micky774 @AnonymousFool 👋 Wanted to check in on the status of your reviews, see if you needed anything or if there are any roadblocks I can help troubleshoot. Thanks!

@AnonymousFool

Oh my god, well this fell off my radar somehow. That was irresponsible of me. Mea culpa.

I've got too much scheduled today to work on it, so I'll start work in earnest tomorrow.

@AnonymousFool

AnonymousFool commented Apr 18, 2024

Review checklist for @AnonymousFool

Conflict of interest

Code of Conduct

General checks

  • Repository: Is the source for this learning module available at https://github.com/m2lines/L96_demo?
  • License: Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)
  • Version: Does the release version given match the repository release?
  • Authorship: Has the submitting author (@dhruvbalwada) made visible contributions to the module? Does the full list of authors seem appropriate and complete?

Documentation

  • A statement of need: Do the authors clearly state the need for this module and who the target audience is?
  • Installation instructions: Is there a clearly stated list of dependencies?
  • Usage: Does the documentation explain how someone would adopt the module, and include examples of how to use it?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support

Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)

  • Learning objectives: Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)
  • Content scope and length: Is the content substantial for learning a given topic? Is the length of the module appropriate?
  • Pedagogy: Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)
  • Content quality: Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?
  • Instructional design: Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.

JOSE paper

  • Authors: Does the paper.md file include a list of authors with their affiliations?
  • A statement of need: Does the paper clearly state the need for this module and who the target audience is?
  • Description: Does the paper describe the learning materials and sequence?
  • Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?
  • Could someone else teach with this module, given the right expertise?
  • Does the paper tell the "story" of how the authors came to develop it, or what their expertise is?
  • References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

@Micky774

Micky774 commented Apr 20, 2024

Sorry for the delay, and thank you for your patience. I will be performing the first part of my review today, and hope to complete a full round by tomorrow evening, circumstances permitting.

@Micky774

Once again, sorry for the delay @dhruvbalwada. The good news is that the vast majority of the non-pedagogical components are already in a fantastic state, and there is no core content missing. If anything, most of these suggestions are meant to round out the existing content and offer more concrete and explicit communication that future learners can benefit from. Below is my first pass over the non-pedagogical sections.

If you have any questions about the feedback, please feel free to let me know! In particular, if there is something you'd like a more detailed discussion and dissection of, it would probably be best to open an issue in your repository corresponding to the specific piece of feedback that needs clarification. We can continue a more detailed discussion there and simply link back to it in this thread for brevity/clarity.


Non-pedagogical components review

General checks

  • Please create an initial release in the repository; for details, see the GitHub docs. The release version should match the version provided in your submission, i.e. v1.0 (see the example commands below).
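
For instance, assuming the release should be tagged v1.0 to match the submission, something like the following would create and push an annotated tag, from which a release can then be drafted in the GitHub web UI (the tag name and message here are illustrative):

git tag -a v1.0 -m "First release reviewed for JOSE"
git push origin v1.0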

Documentation

  • Your README.md lacks a clear statement of need. The easiest resolution would be to add a small section describing a specific (but perhaps non-exhaustive) list of people who may benefit from this content. You describe this a bit in your paper, albeit in a somewhat scattered way, so it should be fairly easy to add. In particular, it would be helpful to specify whether any prior knowledge is required to make full use of this module.
  • In a similar vein, while you provide instructions for building/serving the content, the README lacks a discussion of the contextual use of the repository. Please add some words offering instructions or recommendations for using the repository as a teaching tool itself, e.g. a recommended pace/timeline, or potential adaptations of the content to suit specific needs (this is less obvious and may not be appropriate).
  • While your documentation includes instructions for contributing to the module, it does not provide instructions for reporting problems or obtaining support. This could be as simple as directing users to open an issue in the repository, and perhaps including a code of conduct if appropriate. Optionally, you may provide individual or organizational contact information if there is a commitment to maintenance/support, but this is not strictly necessary.

JOSE paper

  • Your paper lacks a clear statement of need. Most of the content that would comprise the statement of need is present in the submission; however, it is scattered and should instead be collected explicitly in a separate section (see the sketch at the end of this comment).
  • Please cite the source of any data or external models you use as a core part of the module (as opposed to transient or one-time use).
  • Please include some more context regarding the tooling your module covers and its role in the field. Specifically, please explicitly mention and cite some other models/solutions that accomplish tasks similar to the L96 model your module focuses on. Future users stand to gain a lot from a submission that includes relevant citations, since they can use those citations as further reading.
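
To make the statement-of-need items concrete, here is a purely illustrative Markdown skeleton of the kind of section that would satisfy this check in paper.md (and, in condensed form, in the README); the bracketed text is placeholder, not suggested wording:

# Statement of Need

[One or two sentences on why a hands-on machine-learning module built around the Lorenz-96 model is needed, and what gap it fills.] This module is aimed at [target audience], and assumes prior familiarity with [prerequisites, e.g. basic Python and introductory numerical methods].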

@AnonymousFool

AnonymousFool commented May 6, 2024

Alright, I've done a run through of all the required material for the review. I agree with Meekail's feedback thus far, and I found one additional issue with respect to the non-pedagogical requirements that I've documented here.

With respect to the pedagogical content, I think that the structure, ordering, and pacing of ideas throughout the notebooks is impeccable. I think though that there are a lot of small edits I could make to various sentences and formulae to improve their precision and clarity.

I think the most productive and easiest way to deliver and discuss the feedback would be for me to make a new branch of the repository in which I commit the edit ideas as changes to the notebooks. Then I can open a pull request, and we can use GitHub's comment and suggestion infrastructure to organize discussion of the feedback. If, on review, you find the feedback valuable, you can simply merge the changes in.

I've also noticed a lot of small typos and grammatical errors throughout the notebooks, none of which affected my ability to understand the ideas the notebooks communicate. But as part of my editing feedback, I could include spelling and grammatical fixes. Or I could just ignore them if you prefer.

Thoughts @dhruvbalwada?

@dhruvbalwada

@AnonymousFool - If you have the time to make the edits in a new branch, it would be great and very much appreciated.

@IamShubhamGupto

@AnonymousFool let us know how the review is progressing.

If you face any further technical difficulties, reach out to me here or open an issue and I'll address it.

@magsol

magsol commented May 31, 2024

Hi @AnonymousFool and @Micky774, thanks so much for your help so far! I still see some items in your checklists that haven't been addressed. Are you waiting for feedback, or would you be able to continue your reviews?

@Micky774

@magsol I'll be updating my review this upcoming week, but as far as I know we're still waiting on changes in the repository to address the feedback given so far.

@magsol

magsol commented Jun 5, 2024

Hi @dhruvbalwada, the reviewers are indicating that they're waiting on changes on your end. Can you provide an update on how that's going?

@magsol

magsol commented Jun 12, 2024

Hi @dhruvbalwada, @IamShubhamGupto: I saw you working on the feedback from @AnonymousFool, but I'm not clear on whether you have addressed the feedback from @Micky774 yet. I'd like to see if we can wrap this up soon; are you waiting on anything from the reviewers?

@dhruvbalwada

Hi @magsol @Micky774 @AnonymousFool - we have made all the appropriate changes to the repo and the paper according to your suggestions. Please let us know what else to address and how to proceed.

@IamShubhamGupto

@magsol @Micky774 thank you for reviewing our work so far and for waiting on the new changes. I believe that, as of today, all of the remaining requested changes have been published except for creating the v1.0 release. Since we would have to recreate the release to incorporate newer commits, I would prefer to keep that as the last step.

Let me know if the current version of the repository is ready, and I will create the release subsequently.

@AnonymousFool

Alright, yeah, I think the latest round of edits has covered the whole checklist without problems.

I hope at some point to get around to those edit suggestions I wanted to make, but I seem to have bogged myself down in other problems, and I see no reason to hold up publication of what I already believe is a well-functioning educational resource.

@editorialbot
Collaborator Author

I'm sorry human, I don't understand that. You can see what commands I support by typing:

@editorialbot commands

@magsol

magsol commented Aug 23, 2024

@editorialbot set 10.5281/zenodo.13357587 as archive

@editorialbot
Collaborator Author

Done! archive is now 10.5281/zenodo.13357587

@magsol

magsol commented Aug 23, 2024

@editorialbot recommend-accept

@editorialbot
Collaborator Author

Attempting dry run of processing paper acceptance...

@editorialbot
Collaborator Author

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

✅ OK DOIs

- 10.1017/cbo9780511617652.004 is OK
- 10.1093/acrefore/9780190228620.013.826 is OK

🟡 SKIP DOIs

- None

❌ MISSING DOIs

- None

❌ INVALID DOIs

- None

@editorialbot
Collaborator Author

👋 @openjournals/jose-eics, this paper is ready to be accepted and published.

Check final proof 👉📄 Download article

If the paper PDF and the deposit XML files look good in openjournals/jose-papers#150, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

editorialbot added the recommend-accept (Papers recommended for acceptance in JOSE) label on Aug 23, 2024
@dhruvbalwada

dhruvbalwada commented Aug 27, 2024

@magsol - wondering if we need to do anything else for the final/next step?

@magsol

magsol commented Aug 28, 2024

@dhruvbalwada Nope, only @openjournals/jose-eics can make the final acceptance. We just have to sit tight :)

@dhruvbalwada

@magsol - I noticed that the tag @openjournals/jose-eics doesn't seem to link to anything. Is there anything else we can do to help push things along?

@magsol

magsol commented Sep 11, 2024

@dhruvbalwada Like the checkboxes in the editor checklist above, that link requires certain permissions to view; suffice it to say, it does link to the list of EiCs for JOSE.

That said, we still just have to sit tight. I'm pinging the EiCs, but otherwise they're the only ones who can finalize the acceptance. I know it's taking a while and I deeply appreciate your patience; we're almost there.

@IamShubhamGupto

@magsol does JOSE have a publication frequency, such as monthly releases? If so, with the new month almost here, I was hoping we could see some progress on this front.

Thanks

@magsol

magsol commented Sep 30, 2024

@IamShubhamGupto There is no publication frequency with JOSE; publication happens as soon as all the checks have been made. The simple fact is that we are experiencing a massive, months-long backlog that started with the COVID pandemic and that we are still struggling to get out from under. I understand wanting to get this published as soon as possible, and it will be, but for now we just have to wait. I know this isn't the answer you want, but I do appreciate your patience as we work through the backlog.

@IamShubhamGupto

@magsol Thank you for the clarification! As of now I see two other issues with the recommend-accept label; is this the backlog you are referring to?

@magsol

magsol commented Oct 1, 2024

Partially. The backlog extends well beyond JOSE, as the EiCs have many competing responsibilities.

@labarba
Member

labarba commented Oct 10, 2024

Post-Review Checklist for Editor and Authors

Additional Author Tasks After Review is Complete

  • Double check authors and affiliations (including ORCIDs)
  • Make a release of the software with the latest changes from the review and post the version number here. This is the version that will be used in the JOSE paper.
  • Archive the release on Zenodo/figshare/etc and post the DOI here.
  • Make sure that the title and author list (including ORCIDs) in the archive match those in the JOSE paper.
  • Make sure that the license listed for the archive is the same as the software license.

Editor Tasks Prior to Acceptance

  • Read the text of the paper and offer comments/corrections (as either a list or a PR)
  • Check the references in the paper for corrections (e.g. capitalization)
  • Check that the archive title, author list, version tag, and the license are correct
  • Set archive DOI with @editorialbot set <DOI here> as archive
  • Set version with @editorialbot set <version here> as version
  • Double check rendering of paper with @editorialbot generate pdf
  • Specifically check the references with @editorialbot check references and ask author(s) to update as needed
  • Recommend acceptance with @editorialbot recommend-accept

@labarba
Member

labarba commented Oct 10, 2024

The title and author list on the Zenodo archive do not match the paper title and authors. Please check. (On Zenodo you can just update the metadata; there's no need for a new archive version. Note that you may not want Zenodo to update the version automatically with each release, and that Zenodo pulls the author list from the repository's committers.)

@labarba
Member

labarba commented Oct 10, 2024

License mismatch: I note that the Zenodo archive shows Creative Commons Attribution 4.0 International, while the GitHub repository shows MIT license. Licenses should match.

Relatedly, the footer of the JupyterBook just has a copyright notice, with no license listed. This could mislead readers who visit only the rendered book into thinking the materials are all rights reserved. I suggest modifying the footer to add license info.

And since this submission has both narrative content and code, you may consider dual licensing: MIT for code, CC-BY for text and figures.

@dhruvbalwada

dhruvbalwada commented Oct 10, 2024

After updating the footer and licensing, would you like us to do another release and update the Zenodo archive with it?

The reason the Zenodo metadata does not match is that some authors did not contribute directly on GitHub, and most authors do not have their correct affiliations on GitHub. This means we have to update the Zenodo metadata manually. We are happy to do this, but would like to do it only after all other changes have been made, unless you know of some way to copy the author metadata from an older Zenodo version (the first few versions have the right authors). Since the review process kept asking for new version after new version, we gave up on updating the Zenodo metadata for so many authors and decided to wait until the end.

@labarba
Member

labarba commented Oct 10, 2024

The handling editor @magsol has issued recommend-accept, so there should not be any more changes, but please go over the post-review checklist and double-check everything.

@labarba
Member

labarba commented Oct 11, 2024

After updating the footer and licensing, would you like us to do another release and update the Zenodo archive with it?

The license in Zenodo is set in the metadata, so it would not require a new version; it is up to you whether you want to update the archive after minor edits like the footer (which just changes a config file).

@IamShubhamGupto

@labarba @dhruvbalwada

Let me know if this preview is good for the website:
[screenshot of the proposed footer]

It's replicated across all pages; if all looks good, I'll make a PR.

@labarba
Member

labarba commented Oct 11, 2024

You could list the copyright owners as well as the year, and then the license. For example:

(c) Copyright 2024 Dhruv Balwada, Ryan Abernathey, Shantanu Acharya, et al. — License: MIT for code, CC-BY for text and figures.
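
If the rendered site is built with Jupyter Book, as the footer discussion suggests, one place this text could live is the html.extra_footer option in _config.yml. A minimal sketch, assuming that option is available in the Jupyter Book version used by the repository (please double-check against its actual config):

html:
  extra_footer: |
    <p>
      (c) Copyright 2024 Dhruv Balwada, Ryan Abernathey, Shantanu Acharya, et al.
      License: MIT for code, CC-BY for text and figures.
    </p>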

@IamShubhamGupto

@labarba

[screenshot of the updated footer]

@labarba
Member

labarba commented Oct 11, 2024

Dual-licensed Jupyter notebooks are the norm nowadays, but they are not handled gracefully by hosting services. You could leave the CC-BY license in Zenodo (I think it admits only one license, but I haven't checked in a while). In the GitHub repo, you could add a license notice in the README.
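
For example, a short notice along these lines in the README would make the dual licensing explicit (a sketch only; the file name and link targets should be checked against the repository):

## License

The code in this repository is released under the MIT License (see the LICENSE file).
The narrative text and figures in the notebooks and the rendered book are licensed under CC-BY 4.0.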

@IamShubhamGupto

Once merged, we will have to update the version one more time on GitHub, if I'm not wrong.

@labarba
Member

labarba commented Oct 11, 2024

We don't require a new version tag for minor tweaks like this, but up to you.

@dhruvbalwada

@labarba - I believe we have made all the changes you requested.
