How to convert a Rust char to an integer so that '1' becomes 1?

I am trying to find the sum of the digits of a given number. For example, 134 will give 8.

My plan is to convert the number to a string using .to_string() and then use .chars() to iterate over the digits as characters. Then I want to convert each char in the iteration to an integer and add it to a variable, whose final value is the answer.

I tried using the code below to convert char to an integer:

    fn main() {
        let x = "123";
        for y in x.chars() {
            let z = y.parse::<i32>().unwrap();
            println!("{}", z + 1);
        }
    }

(Playground)

But this leads to this error:

    error[E0599]: no method named `parse` found for type `char` in the current scope
     --> src/main.rs:4:19
      |
    4 |         let z = y.parse::<i32>().unwrap();
      |                   ^^^^^

The code below does exactly what I want, but I first have to convert each char to a string, then to an integer, and then increment sum by z.

    fn main() {
        let mut sum = 0;
        let x = 123;
        let x = x.to_string();
        for y in x.chars() {
            // converting `y` to a string and then to an integer
            let z = (y.to_string()).parse::<i32>().unwrap();
            // incrementing `sum` by `z`
            sum += z;
        }
        println!("{}", sum);
    }

(Playground)

+10
4 answers

You need char::to_digit. It converts the char to the number it represents in the given radix.

You can also use Iterator::sum to conveniently calculate the sum of the sequence:

    fn main() {
        const RADIX: u32 = 10;
        let x = "134";
        println!("{}", x.chars().map(|c| c.to_digit(RADIX).unwrap()).sum::<u32>());
    }
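Since to_digit returns an Option<u32>, input that might contain non-digit characters can be handled without unwrap. A minimal sketch (the string "1a3!4" is just an illustration):

    fn main() {
        let x = "1a3!4";
        // to_digit returns None for anything that is not a digit in base 10,
        // so filter_map keeps only the real digits
        let sum: u32 = x.chars().filter_map(|c| c.to_digit(10)).sum();
        println!("{}", sum); // prints 8
    }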
+12
    my_char as u32 - '0' as u32

There is a lot to unpack in this answer, though.

This works because the ASCII encoding (and therefore UTF-8) stores the Arabic numerals 0-9 as consecutive, ascending code points. You can take the scalar values and subtract them.

However, what should it do for values outside this range? What happens if you provide 'p'? It returns 64. How about '.'? It panics in a debug build, because the subtraction underflows. And '♥' will return 9781.
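A small sketch of those cases, just to make the behaviour concrete:

    fn main() {
        // digits behave as expected
        assert_eq!('1' as u32 - '0' as u32, 1);
        // but anything outside '0'..='9' silently yields a meaningless number
        println!("{}", 'p' as u32 - '0' as u32); // 64
        println!("{}", '♥' as u32 - '0' as u32); // 9781
        // '.' as u32 - '0' as u32 underflows and panics in a debug build
    }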

Strings are not just bags of bytes. They are UTF-8 encoded, and you cannot simply ignore that fact. Each char can store any Unicode scalar value.

This is why strings are the wrong abstraction for the problem.

Speaking of efficiency, allocating a string just to pull out the digits seems wasteful. Rosetta Code has an example of an iterator that only performs numeric operations:

    struct DigitIter(usize, usize);

    impl Iterator for DigitIter {
        type Item = usize;

        fn next(&mut self) -> Option<Self::Item> {
            if self.0 == 0 {
                None
            } else {
                // peel off the lowest digit, then divide it away
                let ret = self.0 % self.1;
                self.0 /= self.1;
                Some(ret)
            }
        }
    }

    fn main() {
        println!("{}", DigitIter(1234, 10).sum::<usize>());
    }
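As a quick check against the question's example (reusing the DigitIter type above):

    fn main() {
        // 134 -> 4 + 3 + 1 = 8, the result the question asks for
        assert_eq!(DigitIter(134, 10).sum::<usize>(), 8);
    }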
+1

Another way is to iterate over the characters of your string, converting and adding them with fold.

    fn sum_of_string(s: &str) -> u32 {
        s.chars().fold(0, |acc, c| c.to_digit(10).unwrap_or(0) + acc)
    }

    fn main() {
        let x = "123";
        println!("{}", sum_of_string(x));
    }
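Note that unwrap_or(0) quietly treats every non-digit character as 0; whether that is what you want depends on your input. A quick check (reusing sum_of_string above):

    fn main() {
        // the 'a' contributes 0, so this returns 4 instead of reporting an error
        assert_eq!(sum_of_string("1a3"), 4);
    }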
0

The parse() method is defined on str, not on char. A char is a Unicode scalar value, which is 32 bits wide. If you cast it to an integer, u32 is preferable to i32.

You can convert it with as or into():

    let a = '♥' as u32;
    let b: u32 = '♥'.into();
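Going the other way is fallible, since not every u32 is a valid Unicode scalar value; std::char::from_u32 returns an Option for that reason. A small sketch:

    fn main() {
        let heart = std::char::from_u32(9829);       // Some('♥'), i.e. 0x2665
        let surrogate = std::char::from_u32(0xD800); // None: surrogates are not chars
        println!("{:?} {:?}", heart, surrogate);
    }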
-1
