Bra-ket notation was invented by Dirac to extend the mathematical operations usually reserved for spatial vectors in Euclidean space to conceptually equivalent operations on more abstract and general vectors in a Hilbert space.
For example, we can denote the position of a particle as:
\vec{r} = x \hat{x} + y\hat{y} + z \hat{z}
where the “hats” are unit vectors that point in three perpendicular directions in Cartesian coordinates. The fact that \vec{r} is a vector is denoted by the arrow above the r. Another way to write \vec{r} could be in the form of a column vector:
\vec{r} = \left ( \begin{array}{c} x \\ y \\ z \end{array} \right )
Finally, the dot product between two spatial vectors is given by:
\vec{a} \cdot \vec{b} = \left ( \begin{array}{ccc} a_x & a_y & a_z \end{array} \right )\left ( \begin{array}{c} b_x \\ b_y \\ b_z \end{array} \right ) = a_x b_x + a_y b_y + a_z b_z
where we wrote \vec{a} as a row vector and \vec{b} as a column vector so that we could use the rules of matrix multiplication to get the correct form of the dot product.
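As a minimal numerical sketch of the row-times-column rule above (the component values are illustrative, not from the text):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])   # components a_x, a_y, a_z
b = np.array([4.0, 5.0, 6.0])   # components b_x, b_y, b_z

# Write a as a 1x3 row vector and b as a 3x1 column vector, then
# use ordinary matrix multiplication...
row_times_col = (a.reshape(1, 3) @ b.reshape(3, 1)).item()

# ...which reproduces the component sum a_x b_x + a_y b_y + a_z b_z.
component_sum = a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

print(row_times_col)   # 32.0
print(component_sum)   # 32.0
```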
All of the concepts discussed above for 3D spatial vectors have equivalent analogies for generalized vectors in a Hilbert space:

spatial column vector \Rightarrow ket = \left| b \right>

spatial row vector \Rightarrow bra = \left< b \right|

The relationship between spatial column and row vectors is just the transpose. In Hilbert space where you need to worry about complex numbers in general, you need to take the complex conjugate transpose to go from a ket to a bra:
\left| b \right> = \left ( \begin{array}{c} b_x \\ b_y \\ b_z \end{array} \right ) \Rightarrow \left< b \right| = \left ( \begin{array}{ccc} b^*_x & b^*_y & b^*_z \end{array} \right )
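A short sketch of the ket-to-bra rule, using illustrative complex components: the bra is the conjugate transpose of the ket, and the resulting inner product <b|b> comes out real and non-negative (the squared norm).

```python
import numpy as np

ket_b = np.array([[1 + 2j],
                  [3 - 1j],
                  [0 + 1j]])    # 3x1 column vector: the ket |b>

bra_b = ket_b.conj().T          # the bra <b| = conjugate transpose of |b>

# <b|b> = |1+2j|^2 + |3-1j|^2 + |1j|^2 = 5 + 10 + 1 = 16
norm_sq = (bra_b @ ket_b).item()
print(norm_sq)   # (16+0j)
```

Note that a plain transpose (without the complex conjugate) would give 16 here only by accident of the chosen numbers; in general it would produce a complex "norm", which is why the conjugate is required.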

dot product = \vec{a} \cdot \vec{b} \Rightarrow inner product = \left< a | b \right>

In Euclidean space, spatial vectors are rows and columns of real numbers and the basis is just a set of unit vectors. However, in Hilbert space, vectors are rows and columns of complex numbers that could represent a function, and the basis could be a set of eigenfunctions. In this case, the concept of a dot product needs to be generalized to the concept of an inner product, which is now an integral (see below). Here the vector space is defined only on the interval x_0 \le x \le x_1, and w(x) is a weight function, which is often just w(x)=1 in the simplest cases but can be more complicated in general for Sturm-Liouville problems.
\left< g | f \right> = \int_{x_0}^{x_1} g^*(x) f(x) w(x)\ dx
perpendicular spatial vectors have a zero dot product \vec{a} \cdot \vec{b} = 0 \Rightarrow orthogonal generalized vectors (or functions) have a zero inner product \left< g | f \right> = 0
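The integral inner product and the orthogonality condition can be sketched numerically. This is a minimal illustration, assuming w(x) = 1 and the interval [0, pi], where sin(x) and sin(2x) happen to be orthogonal; the trapezoid-rule quadrature is just one simple way to approximate the integral.

```python
import numpy as np

def inner(g, f, x0, x1, w=lambda x: np.ones_like(x), n=100001):
    """Approximate <g|f> = integral of g*(x) f(x) w(x) dx on [x0, x1]
    with the trapezoid rule on n sample points."""
    x = np.linspace(x0, x1, n)
    y = np.conj(g(x)) * f(x) * w(x)
    dx = x[1] - x[0]
    return dx * (y[0] / 2 + y[1:-1].sum() + y[-1] / 2)

# <sin|sin> on [0, pi] is pi/2 (nonzero: a function is not orthogonal to itself)
print(inner(np.sin, np.sin, 0, np.pi))                 # ~ 1.5707963 = pi/2

# <sin(x)|sin(2x)> on [0, pi] vanishes: the two functions are orthogonal
print(inner(np.sin, lambda x: np.sin(2 * x), 0, np.pi))  # ~ 0
```

The same function with a nontrivial w(x) would cover the weighted inner products that appear in Sturm-Liouville problems.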