In[1]:= \[Theta]1[t_] := 2 Sin[t]; \[Theta]2[t_] := 3 Cos[t];
In[2]:= SumOfSqr = ((2 Cos[\[Theta]1[t]] - 1.5 Sin[\[Theta]1[t]]) - (0.08 Cos[\[Theta]2[t]]^2 - 0.08 Sin[\[Theta]2[t]]^2 + 0.05/\[Theta]1[t] Cos[\[Theta]1[t]/2]))^2 + ((2 Sin[\[Theta]1[t]] + 1.5 Cos[\[Theta]1[t]]) - (2*0.08 Cos[\[Theta]2[t]] Sin[\[Theta]2[t]] + Sin[\[Theta]1[t]/2]))^2;
In[3]:= {\[Theta]1[t], \[Theta]2[t], t, SumOfSqr} /. NMinimize[SumOfSqr, t, Method->"RandomSearch"][[2]]
Out[3]= {-1.97796, -0.44415, -1.71939, 2.65114}
In[4]:= {\[Theta]1[t], \[Theta]2[t], t, SumOfSqr} /. NMinimize[{SumOfSqr, {\[Theta]1[t]>0, t>0}}, t, Method->"RandomSearch"][[2]]
Out[4]= {0.0261052, 2.99974, 0.013053, 2.43784}
Given well-defined functions of time for Theta1 and Theta2, this rapidly tries to find {Theta1, Theta2, time, sum of squares for those values}.
Notice that I have combined your original set of equations into a single sum of squares, so that one scalar magnitude can be minimized in the search for a solution.
As you can see from the second example, you are also free to introduce constraints on the values of Theta1, Theta2, and t if your system needs them.
Study all the details of NMinimize in its documentation page; click on the tabs and the small triangles to reveal the additional information hidden there.
You can also put the NMinimize inside a Table[] function and use the iteration value of the Table to define the constraints for each solution you desire.
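For example, a minimal sketch of that idea (the unit-wide windows k <= t <= k+1 here are arbitrary illustrations, not something from your problem):

In[5]:= Table[{\[Theta]1[t], \[Theta]2[t], t, SumOfSqr} /. NMinimize[{SumOfSqr, k <= t <= k + 1}, t, Method -> "RandomSearch"][[2]], {k, 0, 3}]

Each iteration of the Table constrains t to a different window, so you get one candidate minimum per window instead of a single global attempt.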
The problem with this approach is that the system has many local minima, and NMinimize is happy to settle into any of them, which is why the sum of squares comes back non-zero.
Perhaps if you experiment with the constraints enough you can find true solutions to your system, possibly by building a large table with small increments in time and then selecting the results with a very small SumOfSqr.
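A sketch of that brute-force scan (the step size 0.001 and the tolerance 10^-6 are arbitrary choices you would tune; note that SumOfSqr contains a 1/\[Theta]1[t] term, so it blows up wherever \[Theta]1[t] = 2 Sin[t] vanishes, i.e. at multiples of Pi, which is why the scan starts just off zero):

In[6]:= candidates = Table[{t, SumOfSqr}, {t, 0.001, 10, 0.001}];
In[7]:= Select[candidates, Last[#] < 10^-6 &]

Any t values that survive the Select are good starting points to refine further, for example by handing them to FindMinimum as initial guesses.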
I had hoped this approach would find real solutions for your system more easily than it does. Perhaps your actual functions for Theta1 and Theta2 will give much better results.