Airport B is 300 mi from airport A at a bearing N 50° E

Airport B is 300 mi from airport A at a bearing of N 50° E (see the figure). A pilot wishing to fly from A to B mistakenly flies due east at 250 mi/h for 20 minutes before noticing the error. (a) How far is the pilot from his destination at the time he notices the error? (Round your answer to one decimal place.) (b) What bearing should he head his plane in order to arrive at airport B? (Round your answer to the nearest degree.)

Solution

a) Use the Law of Cosines to find the distance from the plane to airport B.

Let C denote the third corner of the triangle: the plane's position at the moment the pilot notices the error.

Flying for 20 min = 1/3 h at 250 mi/h gives AC = 250 × (20/60) = 83.33 miles.

Since B bears N 50° E from A and the plane flew due east, the angle at A between AC and AB is 90° − 50° = 40°. By the Law of Cosines,

BC² = AB² + AC² − 2·AB·AC·cos 40°

= (300)² + (83.33)² − 2(300)(83.33) cos 40°

BC = 242.2 miles
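As a quick numerical check of the step above (a sketch using only Python's standard library; the variable names are my own, not from the original solution):

```python
import math

AB = 300.0              # distance from A to B, in miles
AC = 250.0 * (20 / 60)  # distance flown due east in 20 min at 250 mi/h
angle_A = math.radians(40)  # angle at A: 90° - 50° bearing = 40°

# Law of Cosines: BC^2 = AB^2 + AC^2 - 2*AB*AC*cos(A)
BC = math.sqrt(AB**2 + AC**2 - 2 * AB * AC * math.cos(angle_A))
print(round(BC, 1))  # distance from the plane to airport B, about 242.2 mi
```

Running this confirms BC ≈ 242.2 mi, matching the hand computation.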

b) Use the Law of Sines in the triangle to find the angle ACB = x:

300/sin x = 242.16/sin 40°

sin x = 300 sin 40° / 242.16 = 0.796

x = 52.77° or x = 180° − 52.77° = 127.23°

Since AB = 300 mi is the longest side of the triangle, the angle opposite it, x = ∠ACB, must be the largest angle, so we take the obtuse solution: x = 127.23°.

At C, the direction back toward A is due west, so CB makes 180° − 127.23° = 52.77° with due east, i.e. 90° − 52.77° = 37.23° east of north. The new heading is therefore N 37° E (to the nearest degree).
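The bearing can be verified independently with coordinates (a sketch under the assumption that A is at the origin, with x pointing east and y pointing north; names are illustrative):

```python
import math

AB = 300.0              # miles from A to B
AC = 250.0 * (20 / 60)  # miles flown due east

# B lies at bearing N 50° E from A, so its east/north components are:
Bx = AB * math.sin(math.radians(50))
By = AB * math.cos(math.radians(50))
Cx, Cy = AC, 0.0        # the plane flew due east from A

# Bearing of B as seen from C, measured clockwise from north
bearing = math.degrees(math.atan2(Bx - Cx, By - Cy))
print(f"N {bearing:.0f}° E")  # about N 37° E
```

The positive east component of the vector from C to B confirms the heading is east of north, not west.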

The pilot is 242.2 miles from his destination at the time he notices the error.

Tutor: Dr Jack