I am trying to normalize data and realized it isn't working because I'm getting an incorrect maximum: the comparison of numbers isn't behaving correctly. Here is my code:
var max = Number.MIN_VALUE;
var min = Number.MAX_VALUE;
for (i = 0; i < array.length; i++) {
    if (array[i] > max) {
        max = array[i];
    }
}
for (i = 0; i < array.length; i++) {
    if (array[i] < min) {
        min = array[i];
    }
}
console.log("max: " + max);
console.log("min: " + min);
for (i = 0; i < array.length; i++) {
    if (array[i] != 0) {
        if (array[i] > max) {
            console.log(i + ": " + array[i] + " yes");
        } else {
            console.log(i + ": " + array[i] + " no " + max);
        }
    }
}
I am getting a lot of console output, but here is one example:
241590: 17.5799 no 9.86874
meaning that for some reason JS thinks 17.5799 is not bigger than 9.86874. Why is this happening and what can I do to fix it? Thank you!
EDIT 1: When printing array[i] - max, I end up with the correct difference...except it's negative.
It's because your array contains strings, not numbers.
So '9.95799' > '89.86874' => true :) Use numbers instead.