That's a moderately complicated problem in spherical trigonometry, if the distances involved are more than a few miles. For sufficiently short distances, plane trigonometry can be used.
The formulas involved can be subject to rounding errors that make the results somewhat inaccurate. Thus all calculations should use the maximum precision available.
I tried to derive a single formula into which you could insert the latitudes and longitudes of your known points. Had I succeeded, it would have covered several pages this size, and it would only have applied to the particular problem geometry I assumed. Instead, you can use the identities here to solve the problem for the particular set of angles you have. Here is an approach I believe will be successful.
1) Define point 1 to be the reference point on the west side of your bird's location in the northern hemisphere, and point 2 to be the reference point on the east side. (Reverse east/west for southern-hemisphere locations. This is just so the problem can be talked about conveniently in what follows.)
2) In accordance with the nomenclature used at the identities link, define "C" to be the difference in longitude between point 1 and point 2. Define "a" to be the complement of the latitude of point 2, and "b" to be the complement of the latitude of point 1. Solve for angular distance "c" between point 1 and point 2 using equation (13).
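If it helps, here is a short Python sketch of step 2. I'm assuming equation (13) at the link is the spherical law of cosines for sides, cos c = cos a cos b + sin a sin b cos C; the coordinates are made-up example values, with west longitude taken as positive:

```python
from math import radians, degrees, sin, cos, acos

# Step 2 sketch. Hypothetical reference points, west longitude positive
# (so point 2, to the east, has the smaller longitude).
lat1, lon1 = 40.0, 90.0   # point 1 (west)
lat2, lon2 = 42.0, 85.0   # point 2 (east)

b = radians(90.0 - lat1)       # "b": colatitude of point 1
a = radians(90.0 - lat2)       # "a": colatitude of point 2
C = radians(abs(lon1 - lon2))  # "C": difference in longitude

# Spherical law of cosines for sides (my reading of equation (13)).
c = acos(cos(a) * cos(b) + sin(a) * sin(b) * cos(C))
print(degrees(c))  # angular distance between points 1 and 2
```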
3) Using the definitions of "a" and "b" and "C" above, and the newly computed value for "c", use equations (18) to determine "A" and "B", the surface angles at points 1 and 2, respectively. Note that angle A is the bearing of point 2 from point 1, and angle B is the negative of the bearing of point 1 from point 2. (Positive angles are clockwise from the direction of the nearest pole.)
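Continuing the sketch, here is one way to do step 3. Whatever form equations (18) take at the link, solving the law of cosines for each angle should give the same values:

```python
# Step 3 sketch: with all three sides a, b, c in hand, solve the law
# of cosines for each surface angle. A is the bearing of point 2 from
# point 1 (clockwise from north); the bearing of point 1 from point 2
# is the negative of B, i.e. 360 degrees minus B.
A = acos((cos(a) - cos(b) * cos(c)) / (sin(b) * sin(c)))  # angle at point 1
B = acos((cos(b) - cos(a) * cos(c)) / (sin(a) * sin(c)))  # angle at point 2
```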
4) Now consider the new triangle (point 1)-(point 2)-(bird). We redefine the variables used above and use them to solve this triangle.
new A = the magnitude of the difference between the bearings, as seen from point 1, of the bird and of point 2.
new B = the magnitude of the difference between the bearings, as seen from point 2, of the bird and of point 1.
Please note that we'd like these differences to be positive and less than 180 degrees. (I have in mind that point 1, point 2, and the bird are in the same hemisphere. A short sketch of this step follows these definitions.)
new a = angular distance from point 2 to the bird (as measured at the center of the Earth)
new b = angular distance from point 1 to the bird (as measured at the center of the Earth)
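Here is the corresponding piece of the sketch for step 4; the measured bearings of the bird are hypothetical example values:

```python
# Step 4 sketch. The bearings of the bird (degrees clockwise from
# north) are hypothetical measurements.
bearing_bird_1 = 30.0    # bearing of bird as seen from point 1
bearing_bird_2 = 300.0   # bearing of bird as seen from point 2

def angle_between(bearing1, bearing2):
    """Unsigned difference of two bearings, folded into 0..180 degrees."""
    d = abs(bearing1 - bearing2) % 360.0
    return 360.0 - d if d > 180.0 else d

new_A = radians(angle_between(bearing_bird_1, degrees(A)))
new_B = radians(angle_between(bearing_bird_2, 360.0 - degrees(B)))
```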
5) Solve for "new a" and "new b" using equations (53) and (54) to find (a-b) and (a+b); the side "c" of this triangle is the angular distance between points 1 and 2 found in step 2. Then a = ((a+b)+(a-b))/2 and b = ((a+b)-(a-b))/2.
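For step 5, I'm assuming equations (53) and (54) are Napier's analogies; continuing the sketch:

```python
from math import tan, atan, pi

# Step 5 sketch, taking equations (53) and (54) to be Napier's
# analogies. The side between points 1 and 2 is the "c" from step 2.
half_sum = atan(cos((new_A - new_B) / 2) / cos((new_A + new_B) / 2)
                * tan(c / 2))                      # (a+b)/2
if half_sum < 0:
    half_sum += pi  # atan returns (-pi/2, pi/2); the half-sum lies in (0, pi)
half_diff = atan(sin((new_A - new_B) / 2) / sin((new_A + new_B) / 2)
                 * tan(c / 2))                     # (a-b)/2

new_a = half_sum + half_diff   # angular distance from point 2 to the bird
new_b = half_sum - half_diff   # angular distance from point 1 to the bird
```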
6) Now, we can find the latitude of the bird by making another variable mapping and solving a new triangle.
Let "new c" = the value of "new b" we just found
Let "A" be the bearing of the bird at point 1.
Using "A", "old b" (from step 2), and "new c" and equation (11), find "a". That will be the complement of the latitude of the bird.
7) Using the newly found "a", "A", and "new c" from step 6, compute "new C" from equations (18). This "new C" is the difference in longitude between point 1 and the bird in the direction of point 2. In other words, in the western hemisphere, subtract this angle from the longitude of point 1 to get the longitude of the bird.
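For step 7, I use the law of cosines rearranged for the angle at the pole; if equations (18) at the link are a law-of-sines form, the arccosine route below should agree while avoiding the acute/obtuse ambiguity:

```python
# Step 7 sketch: the angle at the pole between point 1 and the bird,
# i.e. their difference in longitude.
new_C = acos((cos(new_b) - cos(bird_colat) * cos(b))
             / (sin(bird_colat) * sin(b)))
bird_lon = lon1 - degrees(new_C)  # west longitude positive, bird east of point 1
```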
Conclusion. In step 6 we computed the bird's latitude, and in step 7 its longitude. In step 5 we computed the angular distances from points 1 and 2 to the bird; multiplying these (in radians) by the Earth's mean radius (6372.795 km) gives the great-circle distances from those points to the bird.
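Finally, the distances, continuing the sketch:

```python
# Convert the angular distances (radians) to great-circle distances.
EARTH_RADIUS_KM = 6372.795
print(f"bird at ({bird_lat:.4f} N, {bird_lon:.4f} W), "
      f"{new_b * EARTH_RADIUS_KM:.1f} km from point 1, "
      f"{new_a * EARTH_RADIUS_KM:.1f} km from point 2")
```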
I hope the variable transformations are not too confusing. I did this only so that the applicable reference formulas would be easy to identify.