# Contributing to Modelina
First of all, thank you 🙇🏾‍♀️ for considering contributing to Modelina; it needs all the help it can get!
This contribution guide is an extension to the core contributing guide that can be found here. Please make sure you go through that beforehand. 🙏🏽
If you have any questions, are unsure how your use case fits in, or want something clarified, don't hesitate to reach out on Slack; we are always happy to help out!
## Acceptance criteria and process
Even though we love contributions, we need to maintain a certain standard of what can be merged into the codebase.
The sections below describe our acceptance criteria based on the type of contribution you make.
### Fixing bugs
The Acceptance Criteria for fixing any bug is that you can reproduce the error with a test: the test should fail until the fix is implemented, and pass afterward.
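For instance, a bug-fix PR might add a regression test like the sketch below. The issue, input, and expected output are all illustrative, not an actual Modelina test:

```ts
import { TypeScriptGenerator } from '@asyncapi/modelina';

describe('hypothetical bug: optional properties lose their type information', () => {
  test('should keep optional properties typed as possibly undefined', async () => {
    const generator = new TypeScriptGenerator();
    const input = {
      $id: 'Person',
      type: 'object',
      properties: {
        name: { type: 'string' }
      }
    };

    const models = await generator.generate(input);

    // This assertion should fail on the buggy code and pass once the fix
    // is implemented; the exact expected string depends on the bug at hand.
    expect(models[0].result).toContain('string | undefined');
  });
});
```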
### New features
The Acceptance Criteria for adding new features require a few things in order for a feature to be accepted. This ensures all features are well described and implemented before being released.
- Not all feature requests from the community (or maintainers!) are accepted: Even though you are welcome to create a new feature without an issue, it might be rejected and turn out to be a waste of your time. We don't want that to happen, so make sure to create an issue first and wait to see if it's accepted after community discussion of the proposal.
- When creating tests for your new feature, aim for as high a coverage number as possible: When you run the tests (`npm run test`), you should see a `./coverage/lcov-report/index.html` file being generated. Use this to see in depth where your tests are not covering your implementation.
- No documentation, no feature: If a user cannot understand a new feature, that feature basically doesn't exist! Remember to make sure that any and all relevant documentation is consistently updated.
- New features such as new presets, generators, inputs, etc., need associated use-case documentation alongside examples. This is not only to showcase the feature, but also to ensure it will always work. Check out our adding examples doc for more information on how to do this.
### Adding examples
The Acceptance Criteria Process for adding examples is not only something we use to showcase features, but also to ensure those features always work. (This is important since examples are picked up by our CI system.)
Adding examples is quite straightforward, so don't feel shy! Here's how to do it:
- Duplicate the TEMPLATE folder and rename it to something that makes sense for your feature. If you can't think of anything, feel free to go with your first thought, since we can always discuss it in the PR afterwards.
- Rename the following package configuration to the same name as your directory.
- Adapt this source code example to reflect your use case (see the sketch after this list).
- Adapt this testing file for your use case. In most cases, it could be as simple as changing the title of the test!
- Add your example to our overall list of examples.
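To give you a feel for the shape of an example, here is a rough sketch of what an example's `index.ts` can look like. The generator and input are illustrative, and in the repository the examples import the generator from the local source rather than the published package, so let the TEMPLATE folder be your authoritative starting point:

```ts
import { TypeScriptGenerator } from '@asyncapi/modelina';

const generator = new TypeScriptGenerator();

// The input the example is built around; swap in whatever showcases your feature
const jsonSchemaDraft7 = {
  $id: 'Person',
  type: 'object',
  properties: {
    email: { type: 'string', format: 'email' }
  }
};

export async function generate(): Promise<void> {
  const models = await generator.generate(jsonSchemaDraft7);
  for (const model of models) {
    console.log(model.result);
  }
}

// Allow the example to be run directly
if (require.main === module) {
  generate();
}
```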
Aaaand you are done! 🎉
### Adding a new preset
Presets are for when you want to customize the generated output. They work like middleware that layers on top of each other. You can read more about presets here.
Here is how you add a new preset:
- All presets are located under `src/generators/${language}/presets`; either duplicate an existing preset and adapt it, or create an empty TypeScript file.
- The preset file has the syntax:
```ts
export const LANGUAGE_MY_PRESET: LanguagePreset = {
  class: {
    // Add preset hooks here
  },
  // enum: {
  //   Add preset hooks here
  // }
};
```
Replace `LANGUAGE` with the generator the preset is for (for example `TYPESCRIPT`), and replace `LanguagePreset` with the corresponding preset type (for example `TypeScriptPreset`). It is optional which models you add preset hooks for, i.e. you can add preset hooks for `enum` alongside `class`, but it's not required. Each generator has a set of outputs you can change; read more about the presets here.
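As a concrete illustration, here is a minimal sketch of a preset for the TypeScript generator that prepends a comment to every rendered class. The preset name and the comment are made up, and the available hook names differ per generator, so treat it as a shape to adapt rather than a recipe:

```ts
import { TypeScriptGenerator, TypeScriptPreset } from '@asyncapi/modelina';

export const TYPESCRIPT_MY_PRESET: TypeScriptPreset = {
  class: {
    // The `self` hook wraps the fully rendered class, so we can prepend a header
    self({ content, model }) {
      return `/** Generated model: ${model.name} */\n${content}`;
    }
  }
};

// Usage: pass the preset to the generator it was written for
const generator = new TypeScriptGenerator({ presets: [TYPESCRIPT_MY_PRESET] });
```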
- Add your preset to the `src/generators/${language}/presets/index.ts` file.
- Add an example to showcase your new feature.
- Add documentation to the language docs that explains the use case and links to your new example.
- In most cases you want to add specific tests for edge cases, or simply to test the preset. To do this, add a new test file in `test/generators/${language}/presets/MyPreset.spec.ts` (replace `MyPreset` with your preset name) and add a test using the following syntax:
```ts
describe('LANGUAGE_MY_PRESET', () => {
  let generator: LanguageGenerator;
  beforeEach(() => {
    generator = new LanguageGenerator({ presets: [LANGUAGE_MY_PRESET] });
  });

  test('should render xxx', async () => {
    const input = {
      $id: 'Clazz',
      type: 'object',
      properties: {
        min_number_prop: { type: 'number' },
        max_number_prop: { type: 'number' },
      },
    };
    const models = await generator.generate(input);
    expect(models).toHaveLength(1);
    expect(models[0].result).toMatchSnapshot();
  });
});
```
Remember to replace `LANGUAGE` and `Language` with the appropriate values.
Aaaand you are done! 🎉
### Adding a new input processor
Input processors are the translators from inputs to MetaModel (read more about the input processing here).
Here is how you can add a new input processor:
- Duplicate the template input processor and rename it to the input you are adding a processor for.
- Adapt the `shouldProcess` function, which is used to detect whether an input processor should process the provided input.
- Adapt the `process` function, which is used to convert the input into meta models (see the sketch after this list).
- Duplicate the template input processor tests and rename it to the input you are adding a processor for.
- Adapt the testing code based on your input and the expected MetaModel conversion.
- Export your input processor.
- Add your input processor as part of the main input processor.
- Add a test for the main input processor to ensure that your input processor is accessed accordingly.
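For orientation, here is a rough sketch of the two methods an input processor implements. It assumes the `AbstractInputProcessor` base class and `InputMetaModel` type exported by Modelina (exact signatures may differ), and the detection check and conversion logic are placeholders for whatever your input format requires:

```ts
import { AbstractInputProcessor, InputMetaModel } from '@asyncapi/modelina';

export class MyInputProcessor extends AbstractInputProcessor {
  // Cheap structural check so the main input processor can route inputs here
  shouldProcess(input: any): boolean {
    return (
      typeof input === 'object' &&
      input !== null &&
      input.myFormatVersion !== undefined
    );
  }

  process(input: any): Promise<InputMetaModel> {
    if (!this.shouldProcess(input)) {
      return Promise.reject(new Error('Input is not supported by MyInputProcessor'));
    }
    const inputModel = new InputMetaModel();
    inputModel.originalInput = input;
    // ...convert the input into MetaModel instances and register them on inputModel.models
    return Promise.resolve(inputModel);
  }
}
```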
That's it for the code and tests; now all that remains is docs and examples! 🔥
- Add a new example showcasing the new supported input.
- Add the usage example to the usage document.
- Add the new supported input to the main readme file.
Aaaand you are done! 🎉
### Adding a new generator
Generators sit at the core of Modelina and frame the core concepts of what you can generate. It's therefore no small task to create a new one, so don't get discouraged; we are here to help you!
To make it easier to contribute a new generator, and to avoid focusing too much on the internals of Modelina, we created a template generator to get you started. If you encounter discrepancies with the following guide or templates, make sure to raise an issue so it can be fixed!
#### Getting started
- Start by copy/pasting the template generator and tests and rename it to your generator.
- Search and replace within your new generator and test folder for `Template`, `template`, and `TEMPLATE`, replacing each with your generator name in the matching case.
- Replace the filenames `Template...` with your generator name.
- Add your generator to the generator index file.
Now it's time to adapt the template into whatever it is you are generating:
- Adapt the constraint logic and the type constraints based on what is allowed within your output. Read more about the constraint logic here.
- Add all of the reserved keywords that the models must never generate in the Constant file (see the sketch after this list).
- Adapt/create the first renderers. The template by default includes two renderers, one for rendering enums and one for classes, but you can create whatever renderers make sense. For example, in Rust it's not called a class but a struct, so the renderer is called a `StructRenderer`.
- Adapt the file generator and the rendering of the complete models to fit your generator.
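As an illustration of the reserved-keyword step, here is a minimal sketch of what such a constants file can look like. The keyword list and helper function are hypothetical; mirror whichever structure the existing generators use:

```ts
// Keywords the generated models must never use as names; fill this with the
// actual reserved words of your output language.
export const RESERVED_MY_LANGUAGE_KEYWORDS = [
  'abstract',
  'class',
  'enum',
  'return',
  'type'
];

export function isReservedMyLanguageKeyword(
  word: string,
  forceLowerCase = true
): boolean {
  const checkedWord = forceLowerCase ? word.toLowerCase() : word;
  return RESERVED_MY_LANGUAGE_KEYWORDS.includes(checkedWord);
}
```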
An important note about presets: they are used to extend and build upon the default model (the bare minimum of a data model) so that Modelina can support multiple features. You can read more about presets here. If you have any questions or want something clarified, don't hesitate to reach out on Slack.
Time to adapt the tests, because without tests, it's just an empty promise. The tests included in the template are really just placeholders, so make sure you adapt them to your code.
- Add a mocked renderer in the TestRenderers file.
- Adapt the constrainer tests based on the output.
- Adapt the reserved keywords tests.
- Adapt the generator tests.
- Adapt the renderer tests.
- Add your generator to the FileGenerators test to ensure the models are accurately written to files.
Lastly, we need to adapt some of the docs to showcase your awesome new generator! Because if users can't find it, it doesn't exist.
- Add your generator-specific documentation under languages and add it to the list of generators.
- Add your generator to the list of generators in the main readme file.
- Add a basic usage example to the usage documentation; you can read more about how to create examples here.
Aaaand that's it! As a rule of thumb, start small and slowly add more features, don't try to push everything into one PR, as it will take forever to review, code, and merge.
PRs you can look to for guidance on how the process goes:
## FAQs
Below are some of the typical questions we've received about contributing to Modelina.
### Can I solve issues not labeled "good first issue"?
Absolutely!
Regular issues are generally not as well described in terms of what needs to be accomplished, and they require some knowledge of the library internals.
If you find an issue you would like to solve, ping one of the maintainers to help you get started. Some issues may require a higher level of effort to solve than is easily described within the issue, so don't feel shy to chat with us about individual issues. 🙂
### What does the CI system do when I create a PR?
Because the CI system is quite complex, we've designed it so that individual contributors don't need to understand it in depth.
That said, here is a general rundown on what's triggered by each PR:
- We inherit all AsyncAPI core GitHub workflows, including the most important one:
  - A standard PR workflow which ensures that the following commands succeed: `npm run test`, `npm run lint`, and `npm run generate:assets`.
- Coveralls ensures we get test coverage statistics on each PR, so we can see how a change affects overall test coverage. It creates a comment on the PR with the coverage status.
- SonarCloud runs a code analysis to ensure there are no bugs, security concerns, code smells, or duplicated code blocks. Make sure you address any concerns it finds, because it comments on the PR if it finds any issue.
At the end of the day, sometimes checks just fail because of weird dependency problems. If any test failures occur that don't look like a problem you can fix, simply tag one of the maintainers. We're there to help! 🙂