
Add OLMo April + July 2024 Checkpoints & Config #547

Merged

2015aroras merged 11 commits into main from shanea/add-olmo-1.7-7b-to-readme on Aug 15, 2024

Conversation

@2015aroras (Collaborator)

No description provided.

paths:
######### NON WEB DATA #########
# ~> GUTENBERG BOOKS (5.256 GT)
- s3://ai2-llm/preprocessed/olmo-mix/v1_6-decontaminated/books/gpt-neox-olmo-dolma-v1_5/part-0-00000.npy
@2015aroras (Collaborator, Author)

https data paths not yet available
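
For anyone who wants to inspect one of these shards before https mirrors are up, here is a minimal sketch. It assumes the object has been copied locally (for example with aws s3 cp) and that the file is a raw flat buffer of uint16 token IDs with no .npy header; the data-loading code in this repo is the authority on the actual dtype and layout.

# Minimal sketch: peek at one preprocessed shard locally.
# Assumptions (not stated in this PR): the shard was downloaded first, e.g.
#   aws s3 cp s3://ai2-llm/preprocessed/olmo-mix/v1_6-decontaminated/books/gpt-neox-olmo-dolma-v1_5/part-0-00000.npy .
# and it is a raw uint16 token-ID buffer rather than a header-carrying .npy.
import numpy as np

tokens = np.memmap("part-0-00000.npy", dtype=np.uint16, mode="r")
print(f"{tokens.shape[0]:,} tokens in shard")
print("first 16 token IDs:", tokens[:16].tolist())

If the file turns out to carry a real .npy header, np.load("part-0-00000.npy", mmap_mode="r") is the drop-in replacement.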

@2015aroras (Collaborator, Author)

Done

@dumitrac (Contributor) commented May 1, 2024

@2015aroras - do you think that for 1.7 we should have two configs? I understand that training was stopped, the config was changed, and then training was resumed.

@soheeyang

Hi, thank you so much for the work! Is there an estimated date for when all the details, including the https links to the training data and the data order files, will be released?
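
For context on the data order files: such a file is typically a flat array of shuffled instance indices, consumed one slice per global training batch. A sketch under assumptions, with the filename, dtype (uint32), and batch size all hypothetical:

import numpy as np

# Hypothetical filename; the dtype and batch size are assumptions, not release facts.
global_indices = np.memmap("global_indices.npy", dtype=np.uint32, mode="r")
batch_size = 2048  # instances per global batch (assumed)

def get_batch_instances(step: int) -> list[int]:
    # Slice out the dataset instance indices seen at a given training step.
    batch = global_indices[step * batch_size : (step + 1) * batch_size]
    return batch.tolist()

# e.g. the first few instances of the very first batch:
print(get_batch_instances(0)[:8])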

@2015aroras (Collaborator, Author)

@kyleclo @soldni Do you have any updates on the data side?

2015aroras marked this pull request as ready for review on July 31, 2024 at 00:08
2015aroras requested a review from epwalsh on July 31, 2024 at 00:08
2015aroras changed the title from "Add OLMo 1.7-7b README + Config" to "Add OLMo April + July 2024 Checkpoints & Config" on Jul 31, 2024
2015aroras merged commit 1e71ce3 into main on Aug 15, 2024
2015aroras deleted the shanea/add-olmo-1.7-7b-to-readme branch on August 15, 2024 at 21:32