For a JavaScript solution, create an array of size 5 for the output, then loop over its indices. If the given number minus the current index is at least 1, output 100; if that difference is between 0 and 1, output the decimal part times 100; otherwise output 0.

```
function createStars(n) {
  const output = Array(5);
  for (let i = 0; i < 5; i++) {
    // num is how much of this star's slot the rating fills
    const num = n - i;
    if (num >= 1) output[i] = 100;
    else if (num >= 0) output[i] = num * 100;
    else output[i] = 0;
  }
  return output;
}

console.log(createStars(3.5)); // [100, 100, 100, 50, 0]
console.log(createStars(4.7)); // [100, 100, 100, 100, 70.00000000000001]; the trailing digits are a floating-point precision artifact
```

A TypeScript solution is identical, just with type annotations added.

```
function createStars(n: number): [number, number, number, number, number] {
  const output: [number, number, number, number, number] = [0, 0, 0, 0, 0];
  for (let i = 0; i < 5; i++) {
    // num is how much of this star's slot the rating fills
    const num = n - i;
    if (num >= 1) output[i] = 100;
    else if (num >= 0) output[i] = num * 100;
    else output[i] = 0;
  }
  return output;
}

console.log(createStars(3.5)); // [100, 100, 100, 50, 0]
console.log(createStars(4.7)); // [100, 100, 100, 100, 70.00000000000001]; the trailing digits are a floating-point precision artifact
```
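If the trailing floating-point digits matter for your use case, one option (an addition here, not part of the original answer) is to round each slot to two decimal places before storing it:

```
// Variant of the loop-based solution that rounds each percentage to
// two decimals, avoiding artifacts like 70.00000000000001.
// Math.round(x * 100) / 100 is one common rounding idiom; the helper
// name createStarsRounded is illustrative, not from the original answer.
function createStarsRounded(n) {
  const output = Array(5);
  for (let i = 0; i < 5; i++) {
    const num = n - i;
    if (num >= 1) output[i] = 100;
    else if (num >= 0) output[i] = Math.round(num * 100 * 100) / 100;
    else output[i] = 0;
  }
  return output;
}

console.log(createStarsRounded(4.7)); // [100, 100, 100, 100, 70]
```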

Alternatively, you could multiply the wanted value by `100`, then for each of the five slots take the minimum of the remaining value and `100`, use that as the slot's percentage, and subtract it from the remaining value.

```
function createStars(v) {
  v *= 100;
  return Array.from({ length: 5 }, () => {
    // Each slot takes at most 100 from what is left of the rating
    const r = Math.min(v, 100);
    v -= r;
    return r;
  });
}

console.log(...createStars(3.5));    // [100, 100, 100, 50, 0]
console.log(...createStars(1.6667)); // [100, 66.67, 0, 0, 0]
```

And here’s the TypeScript version, which again is just the JavaScript solution with type annotations added.

```
function createStars(v: number): number[] {
  v *= 100;
  return Array.from({ length: 5 }, () => {
    // Each slot takes at most 100 from what is left of the rating
    const r = Math.min(v, 100);
    v -= r;
    return r;
  });
}

console.log(...createStars(3.5));    // [100, 100, 100, 50, 0]
console.log(...createStars(1.6667)); // [100, 66.67, 0, 0, 0]
```
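One caveat with the `Math.min` approach: it assumes the input is within 0–5. A negative rating would produce a negative first slot, since `Math.min(v, 100)` never clamps from below. If out-of-range inputs are possible, clamping first is one defensive option (an assumption added here, not part of the original answer):

```
// Same mapping approach, but the input is clamped to the valid 0–5
// range first, so out-of-range ratings cannot produce negative slots.
// The clamp and the name createStarsClamped are added safeguards.
function createStarsClamped(v) {
  let rest = Math.max(0, Math.min(5, v)) * 100;
  return Array.from({ length: 5 }, () => {
    const r = Math.min(rest, 100);
    rest -= r;
    return r;
  });
}

console.log(createStarsClamped(-1)); // [0, 0, 0, 0, 0]
console.log(createStarsClamped(7));  // [100, 100, 100, 100, 100]
```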