I’ve been doing the Weekly Challenges. The latest involved an unusual binary encoding and indexing into a multiplication table. (Note that this is open until 28 November 2021.)
Task 1: Add Binary
You are given two decimal-coded binary numbers, $a and $b.
Write a script to simulate the addition of the given binary numbers.
In other words, interpret 101 as 5, 1001 as 9, and add them to make
14, expressed as 1110.
It would have been possible to do this piecewise, taking each digit of
each number, but instead I chose to write a converter between this
decimal-coded notation and standard integers, particularly once I
realised that the converters to and from could use basically the same code.
Raku:
sub cvradix($n,$r,$tf) {
    my $o=0;
    my $nn=$n;
    my $m=1;
    my $ra;
    my $rb;
    if ($tf==0) { # convert to radix-format
        $ra=$r;
        $rb=10;
    } else { # convert from radix-format
        $ra=10;
        $rb=$r;
    }
    while ($nn > 0) {
        $o+=($nn % $ra)*$m;
        $nn=floor($nn/$ra);
        $m*=$rb;
    }
    return $o;
}
Then the function itself is trivially:
sub dcbadd($a,$b) {
    return cvradix(cvradix($a,2,1)+cvradix($b,2,1),2,0);
}
and the other languages all work basically the same way.
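For instance, here is a rough Rust sketch of the same radix-swapping idea (my own rewrite for illustration, not necessarily identical to the code in the repository; the boolean parameter name is mine):

// Reinterpret the digits of n from base ra to base rb. The flag selects
// whether we are converting to or from the decimal-coded representation.
fn cvradix(n: u64, r: u64, from_radix_format: bool) -> u64 {
    let (ra, rb) = if from_radix_format { (10, r) } else { (r, 10) };
    let mut nn = n;
    let mut o = 0;
    let mut m = 1;
    while nn > 0 {
        o += (nn % ra) * m; // take the low digit in base ra, place it in base rb
        nn /= ra;
        m *= rb;
    }
    o
}

fn dcbadd(a: u64, b: u64) -> u64 {
    // e.g. 101 -> 5 and 1001 -> 9; 5 + 9 = 14 -> 1110
    cvradix(cvradix(a, 2, true) + cvradix(b, 2, true), 2, false)
}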
Task 2: Multiplication Table
You are given 3 positive integers, $i, $j and $k.
Write a script to print the $k-th element in the sorted multiplication table of $i and $j.
I could do various optimising tricks (for example, there's no point in
calculating beyond k*k) but the simplest approach seemed to lie in
avoiding premature optimisation: just work out the actual
multiplication table, sort it, and then index into it. Rust, as an
example:
fn mtable(i: u32, j: u32, k: usize) -> u32 {
    let mut l=vec![];
    for ix in 1..=i {
        for jx in 1..=j {
            l.push(ix*jx);
        }
    }
    l.sort();
    return l[k-1];
}
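As a quick sanity check (example values of my own choosing): for i = 2 and j = 3 the sorted table is 1, 2, 2, 3, 4, 6, so the 4th element is 3.

fn main() {
    // 2x3 table, sorted: [1, 2, 2, 3, 4, 6]; the 4th element is 3
    assert_eq!(mtable(2, 3, 4), 3);
}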
Full code on
github.