javascript - What's the difference when defining var as 0.5 compared to .5?


I'm curious: I've been programming in JavaScript for a few years, but when I look at the following variable declarations I'm confused. (Note: these could be any other numbers.)

  var exampleOne = 0.5;
  var exampleTwo = .5;

What is the difference between the two, or is there any? Is there some hidden benefit that I'm not clearly understanding?

There is no difference.

Both are parsed the same way; that is, 0.5 and .5 (and .50, for that matter) all represent the same number. (Unlike most other languages, JavaScript has only one number type.)
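
For example, a quick check in a browser console or Node REPL (a minimal sketch reusing the question's variable names) confirms the literals are interchangeable:

  var exampleOne = 0.5;
  var exampleTwo = .5;

  console.log(exampleOne === exampleTwo); // true — both literals parse to the same value
  console.log(.5 === 0.50);               // true — a trailing zero doesn't matter either
  console.log(typeof exampleTwo);         // "number" — JavaScript's single numeric type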

Personally, I always like to include the (optional) leading 0 before the decimal point.
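
If you want to enforce that style across a codebase, a linter can do it for you. As a sketch, ESLint's no-floating-decimal rule (assuming you use ESLint) flags literals like .5:

  // .eslintrc.js — a minimal sketch, assuming ESLint is in use
  module.exports = {
    rules: {
      'no-floating-decimal': 'error' // reports `.5`; requires `0.5`
    }
  };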

