Question

There are two algorithms, alg1 and alg2, for a problem of size n. alg1 runs in n^2 microseconds and alg2 runs in 100 n log n microseconds. alg1 can be implemented in 4 hours of programmer time and needs 2 minutes of CPU time to develop; alg2 requires 15 hours of programmer time and 6 minutes of CPU time to develop. If programmers are paid $20 per hour and CPU time costs $50 per minute, how many times must a problem instance of size 500 be solved using alg2 in order to justify its development cost?

Answer
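A worked sketch of the break-even calculation. Two assumptions are made here that the problem statement leaves open: the logarithm is taken as base 10 (with base 2 or base e, 100 n log n exceeds n^2 at n = 500, so alg2 would be slower per run and could never pay for itself), and running time is billed as CPU time at the same $50 per minute as development. Both the "recoup alg2's full development cost" and the "recoup the extra cost over alg1" readings are shown.

```python
import math

# Rates from the problem statement.
PROGRAMMER_RATE = 20.0   # dollars per hour
CPU_RATE = 50.0          # dollars per minute

# Development costs.
dev_alg1 = 4 * PROGRAMMER_RATE + 2 * CPU_RATE    # 4 h + 2 min CPU  = $180
dev_alg2 = 15 * PROGRAMMER_RATE + 6 * CPU_RATE   # 15 h + 6 min CPU = $600

n = 500
# Running times in microseconds; base-10 log is an assumption (see above).
t_alg1 = n ** 2                      # 250,000 us
t_alg2 = 100 * n * math.log10(n)     # ~134,949 us

# Dollars saved each time alg2 is run instead of alg1
# (microseconds -> minutes, then times the CPU rate).
saving_per_run = (t_alg1 - t_alg2) / 1e6 / 60 * CPU_RATE

# Two readings of "justify its development cost":
runs_full = math.ceil(dev_alg2 / saving_per_run)                   # recoup alg2's $600
runs_marginal = math.ceil((dev_alg2 - dev_alg1) / saving_per_run)  # recoup the extra $420

print(runs_full, runs_marginal)  # about 6259 and 4381 runs under these assumptions
```

Under these assumptions each run of alg2 saves roughly $0.096, so alg2's full $600 development cost is recovered after about 6,259 solves, or about 4,381 solves if only the $420 cost difference over alg1 must be justified.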
