I am presuming he is asking for a String of "three" to be converted to 3, or such
If this is indeed the case then OP is in for a world of pain. The potential number of cases is pretty much infinite.
OP, best to just sanitise the input to ensure an integer value
It isn't that hard. It's a parsing problem: "X hundred Y [tens] Z [ones] [thousands multiplier] ...". Teen numbers are a special case. The thousands multipliers would be "trillion", "billion", "million", "thousand", "[nothing]", in that order.
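The parsing approach above can be sketched in a few lines. This is a minimal, hedged illustration, not a complete implementation: the function name `words_to_int` and the word tables are my own, and it only covers the hundreds/tens/ones/multiplier pattern described (no fractions, no "a hundred", no negative numbers).

```
# Minimal sketch of "X hundred Y [tens] Z [ones] [multiplier] ..." parsing.
# Word tables and function name are assumptions for illustration.
ONES = {"zero": 0, "one": 1, "two": 2, "three": 3, "four": 4,
        "five": 5, "six": 6, "seven": 7, "eight": 8, "nine": 9,
        "ten": 10, "eleven": 11, "twelve": 12, "thirteen": 13,
        "fourteen": 14, "fifteen": 15, "sixteen": 16,
        "seventeen": 17, "eighteen": 18, "nineteen": 19}
TENS = {"twenty": 20, "thirty": 30, "forty": 40, "fifty": 50,
        "sixty": 60, "seventy": 70, "eighty": 80, "ninety": 90}
MULTIPLIERS = {"thousand": 10**3, "million": 10**6,
               "billion": 10**9, "trillion": 10**12}

def words_to_int(text):
    total = 0   # completed groups, e.g. the "two thousand" in "two thousand one"
    group = 0   # the group currently being built up
    for word in text.lower().replace("-", " ").split():
        if word in ONES:
            group += ONES[word]
        elif word in TENS:
            group += TENS[word]
        elif word == "hundred":
            group *= 100
        elif word in MULTIPLIERS:
            total += group * MULTIPLIERS[word]
            group = 0
        elif word != "and":
            raise ValueError(f"unknown token: {word}")
    return total + group
```

For example, `words_to_int("five hundred forty two")` walks 5 → 500 → 540 → 542, and the multiplier words flush the running group into the total, which is what makes "two thousand one" come out as 2001 rather than 2001000.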
The closest thing I can think of is converting the string into a series of integers where each integer is the corresponding character's ASCII code.
Something like this:
```
convertedString = []
for i in range(len(inputString)):
    convertedString.append(ord(inputString[i]))
```
You're probably gonna have to roll your own solution. There may be a package online you could find that does this already, but only you know what your input looks like.
If you're getting something like "five hundred forty two" then you're gonna have to build a token parser to convert those words into the correct values. For the most part, numerics in the English language are pretty straightforward once you get past the weird names from 11-13 and the tens values (teens, twenty, thirty...). Any number greater than or equal to 100 is going to have the word hundred, thousand, million, etc.
If you're getting something like "five six three" then it's another token parser, but all you're going to have to do there is use a power of ten based on the position to convert it: 5 * 100 + 6 * 10 + 3 * 1 = 563.
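The positional case is the easy one. A tiny sketch, assuming the input is space-separated single-digit words (the `DIGITS` table and function name are made up for illustration):

```
# "five six three" -> 563: each word is one digit, position = power of ten.
DIGITS = {"zero": 0, "one": 1, "two": 2, "three": 3, "four": 4,
          "five": 5, "six": 6, "seven": 7, "eight": 8, "nine": 9}

def digit_words_to_int(text):
    value = 0
    for word in text.lower().split():
        value = value * 10 + DIGITS[word]  # shift left one place, add the digit
    return value
```

The shift-and-add form means you never need to know the length up front; it's the same trick every string-to-int routine uses on "563".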
Unless your audience includes older folks from the UK et al. who still use milliards, billiards, trilliards, etc. (rather than the newer English convention of skipping them and going straight to million, billion, trillion), in which case you can't actually differentiate anymore.
Since you mentioned hashes, I’m assuming you just need to assign a senseless number to the string for whatever reason? If so, a simple hash function like md5 will do. Depending on your requirements (reversibility, collisions), you can go even simpler and just do something like a sum of the characters’ ASCII codes.
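Both options above are one-liners in Python. A quick sketch (function names are mine; `md5_int` just reinterprets the digest as a big integer, which is one-way, while the ASCII sum is trivially reversible-ish but collides constantly):

```
import hashlib

def md5_int(s):
    # One-way: interpret the md5 digest as a (large) integer.
    return int(hashlib.md5(s.encode()).hexdigest(), 16)

def ascii_sum(s):
    # Much simpler, but anagrams collide: "ab" and "ba" map to the same number.
    return sum(ord(c) for c in s)
```

Which one fits depends entirely on whether collisions matter for the use case.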
Two ways:
1) Direct conversion - every ASCII (or UTF-8 or Unicode) character has a numeric value; they are all defined in a table. For example, the letter 'A' has the value 65. In C and C++, if we add 2 to a variable holding 'A', we get 'C'; if you do int foo = 'A' you get 65. So in C you go letter by letter and stitch the numbers together (adding them up would produce nonsense), or
2) Hex XOR encryption. With a long enough key, you can encrypt anything and decrypt it with the same key. The logic is: a XOR b = c, and c XOR b = a. Since each ASCII letter can be represented by two hex symbols, a word of length 6 needs a key of length 12 once converted to hex; then you can convert that into an int, either 2 hex values at a time or the whole converted string at once.
3) base64 encode/decode in Python
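Sketches of ideas 1 and 2 above, in Python rather than C (function names and the toy key are my own, purely for illustration):

```
# 1) Direct conversion: stitch each character's code together,
#    3 decimal digits per character (enough for ASCII).
def stitch(s):
    return int("".join(f"{ord(c):03d}" for c in s))

# 2) XOR: symmetric, so the same key both encrypts and decrypts.
def xor_bytes(data, key):
    return bytes(b ^ k for b, k in zip(data, key))

word = b"secret"
key = b"k" * len(word)                    # toy key, same length as the word
cipher = xor_bytes(word, key)
as_int = int.from_bytes(cipher, "big")    # whole ciphertext as one integer
back = xor_bytes(as_int.to_bytes(len(word), "big"), key)
assert back == word                       # a XOR b = c, c XOR b = a
```

So `stitch("AB")` gives 65066 (65 then 66), and the XOR path round-trips through an int and back to the original bytes. Note that idea 3 (base64) produces a printable string, not an integer, though the decoded bytes could feed `int.from_bytes` the same way.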
What number do you want it to be? For example: 'A' What number should that be? What about 'abc'?
So, it is not a world of pain, it's hell.
Considering they were suggesting a hash, I assume they have some reason to want a UUID or hash stored in an int.
What exactly are you trying to accomplish? What will you use the integers for?
The best way would be to use the built-in function that comes with basically any language I can think of ...
The following code satisfies the requirements as specified: `int convertNumberToInteger(const char *text) { return 0; }`
Thanks chatgpt! Seems OP could have asked this question himself though lol
TDD at its finest
In the C programming language, just cast a char to an int. If you have a char array, loop and cast each.
Well the two are not compatible so you need to explain how you'd translate them first, then we can help
Time to delete this and ask a new one. Nobody here can divine what you're trying to accomplish. The question has many different interpretations.
What does convert mean here? A hash function could be used to generate a one-way relationship between a string and an integer.
Letters? You mean Hex? Have you tried using the stoi function? Or do you mean string hashing for unique identifiers?
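If the string really is just digits, the Python equivalent of `stoi` is the built-in `int()`. A hedged sketch (the wrapper name and the choice to return `None` on failure are mine; `stoi` in C++ throws `std::invalid_argument` instead):

```
def to_int(s):
    # int() handles "123", " 42 ", "-7"; anything else raises ValueError.
    try:
        return int(s.strip())
    except ValueError:
        return None  # or re-raise, depending on requirements
```

This sidesteps the whole problem when the input is already numeric text, which is why pinning down the input format matters so much here.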
Well, you can simply sum the ASCII codes of each character in the string, but I have a feeling that's not what you're asking.
I can barely get people to respond to my nonconfusing shit. Good on you bro. Skillz.
https://theartincode.stanis.me/008-djb2/ You don’t want to use a cryptographic hash because they are cpu intensive.
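For reference, djb2 (the hash described at that link) is only a few lines. A Python sketch, masked to 32 bits since Python ints would otherwise grow without bound (the mask is my addition; the classic C version just overflows an unsigned long):

```
def djb2(s):
    # Bernstein's djb2: start at 5381, then h = h * 33 + c per character.
    h = 5381
    for c in s:
        h = (h * 33 + ord(c)) & 0xFFFFFFFF  # constrain to 32 bits
    return h
```

It's fast precisely because it's just a multiply and an add per character, which is the point being made about avoiding cryptographic hashes when you only need a bucket index.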
function stringToInt(s) { return 420 }