I have a field that should only accept integers because my backend can't accept float values. When I use Yup.number().integer() to achieve this, it works fine for whole numbers like 2, 3, or 100. However, when I enter a value like 1.0, Yup treats it as an integer instead of a decimal and does not throw a validation error.
Can Yup.number().integer() be configured to consider 1.0 a decimal? Or is there a different approach I should take to validate integer values in Yup? Any insights or suggestions would be greatly appreciated. Thank you!
I have tried using test() with Number.isInteger, but it's not working:
import * as Yup from 'yup';

// Custom test: returns true only when the (already cast) value is an integer
const integerValidation = (value) => {
  if (!Number.isInteger(value)) {
    return false;
  }
  return true;
};

const schema = Yup.object().shape({
  yourIntegerField: Yup.number().test('integer', 'Please enter an integer value', integerValidation),
});
2 Answers
yup.integer(message?: string | function) takes only one argument, the error message displayed when the test fails, so it doesn't seem possible to configure it the way you want.
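For reference, the only thing you can customize is that message, for example:

Yup.number().integer('Please enter a whole number');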
After reading up on this answer, it seems that Number.isInteger(1.0) also returns true, so your integerValidation will behave exactly like yup.integer() does. As that answer explains, 1 and 1.0 are exactly the same number in JavaScript, and there is no way of differentiating the two once the input has been cast. If you really have to tell them apart, you need to treat the raw input as a string and check whether it includes('.'). You would have to change your validation function along the lines of the example below, and you also need to pass the value from the schema to the validation function for it to run.
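The example the answer refers to is not included above, so here is a sketch of one way to do it. It assumes the form submits the field as a string (e.g. "1.0") and that you are on a recent Yup version where the test callback receives the test context as its second argument and exposes context.originalValue, the raw input before Yup casts it to a number:

import * as Yup from 'yup';

const schema = Yup.object().shape({
  yourIntegerField: Yup.number()
    .typeError('Please enter a number')
    .test('integer', 'Please enter an integer value', (value, context) => {
      // Let .required() decide whether an empty field is an error
      if (value == null) return true;
      // context.originalValue is the raw input before casting, so "1.0"
      // still contains the '.' even though it casts to the integer 1
      return Number.isInteger(value) && !String(context.originalValue).includes('.');
    }),
});

// schema.validate({ yourIntegerField: '1.0' }) now rejects,
// while schema.validate({ yourIntegerField: '2' }) resolves.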
I think the validation schema above works fine. Please check the following schema, which accepts only integer numbers.
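The schema this second answer refers to is not shown above; a minimal sketch of what such an integer-only schema might look like (the field name and messages are placeholders, not from the original answer):

import * as Yup from 'yup';

// Accepts only values that cast to an integer; note that "1.0" still
// casts to 1 and passes, as discussed in the first answer.
const integerOnlySchema = Yup.object().shape({
  yourIntegerField: Yup.number()
    .typeError('Please enter a number')
    .integer('Please enter an integer value')
    .required('This field is required'),
});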