Like
Dislike
Broadcast: one-to-many; such content tends to be relatively specialized
Interactions provide rich information about users and content
The U.S. presidential election
Distinguishing communities from cliques
$$\frac{E_s}{\frac{V_s(V_s - 1)}{2}} \geq \gamma$$
Equivalently, after rearranging, the denominator is the maximum degree:
$$|V_s|\gamma \leq \frac{2|E_s|}{|V_s|-1}$$
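A minimal sketch of this density check, assuming the subgraph is given as a vertex set plus an edge list (the function and variable names are hypothetical):

```python
def is_gamma_dense(vertices, edges, gamma):
    """Check E_s / (V_s*(V_s-1)/2) >= gamma for an undirected subgraph.

    vertices: set of vertex ids in the subgraph
    edges: iterable of (u, v) pairs; only edges with both endpoints
           inside `vertices` are counted.
    """
    vs = len(vertices)
    if vs < 2:
        return False
    es = sum(1 for u, v in edges if u in vertices and v in vertices)
    max_edges = vs * (vs - 1) / 2          # denominator: all possible edges
    return es / max_edges >= gamma

# Example: a triangle plus one pendant vertex, checked at gamma = 0.5
edges = [(1, 2), (2, 3), (1, 3), (3, 4)]
print(is_gamma_dense({1, 2, 3, 4}, edges, 0.5))  # 4/6 ≈ 0.67 >= 0.5 -> True
```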
$$similarity = \cos(\theta) = \frac{|A \cap B|}{\|A\| \cdot \|B\|}, \qquad sim(5, 8) = \frac{1}{\sqrt{2} \cdot \sqrt{3}} = \frac{1}{\sqrt{6}}$$
$$J(A, B) = \frac{|A \cap B|}{|A \cup B|}, \qquad J(5, 8) = \frac{|\{6\}|}{|\{1, 2, 6, 13\}|} = \frac{1}{4}$$
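A small sketch that reproduces both worked values; the neighbor sets below are assumptions reconstructed from the example's counts (node 5's neighbors {1, 6}, node 8's neighbors {2, 6, 13}):

```python
import math

def cosine_sim(a, b):
    # |A ∩ B| / (sqrt(|A|) * sqrt(|B|)) for neighbor sets
    return len(a & b) / (math.sqrt(len(a)) * math.sqrt(len(b)))

def jaccard_sim(a, b):
    # |A ∩ B| / |A ∪ B|
    return len(a & b) / len(a | b)

neighbors_5 = {1, 6}        # assumed from |A| = 2 in the example
neighbors_8 = {2, 6, 13}    # assumed from |B| = 3 in the example

print(cosine_sim(neighbors_5, neighbors_8))   # 1/sqrt(6) ≈ 0.408
print(jaccard_sim(neighbors_5, neighbors_8))  # 1/4 = 0.25
```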
$$SS^T = -\frac{1}{2}\left(I - \frac{1}{n}ee^T\right) D \left(I - \frac{1}{n}ee^T\right) = \triangle(D)$$
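A numpy sketch of this double-centering step, assuming (as in classical MDS) that D holds squared Euclidean distances, so the centered result recovers the inner products $SS^T$:

```python
import numpy as np

def double_center(D):
    """-(1/2) * J @ D @ J with J = I - (1/n) e e^T (classical MDS centering)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    return -0.5 * J @ D @ J

# Sanity check on random points: squared distances center back to S S^T
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))
X = X - X.mean(axis=0)                              # center the points
D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distance matrix
print(np.allclose(double_center(D), X @ X.T))       # True
```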
$$\min_{S, \Sigma} \|A - S \Sigma S^T\|_F \quad s.t.\ S \in \{0, 1\}^{n \times k},\ \Sigma \in R^{k \times k} \text{ is diagonal}$$
$$cut(C_1, C_2, C_3, ..., C_k) = \sum_{i=1}^{k} cut(C_i, \overline{C_i})$$
$$Ratio\text{-}cut(C_1, C_2, ..., C_k) = \sum_{i=1}^{k} \frac{cut(C_i, \overline{C_i})}{|V_i|}, \qquad Normalized\text{-}cut(C_1, C_2, ..., C_k) = \frac{1}{k}\sum_{i=1}^{k} \frac{cut(C_i, \overline{C_i})}{vol(V_i)}$$
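A small numpy sketch of these objectives for a given partition, assuming A is a symmetric adjacency matrix and the partition is a list of vertex-index lists (all names hypothetical):

```python
import numpy as np

def cut_value(A, inside, outside):
    # total edge weight leaving the cluster: cut(C, C-bar)
    return A[np.ix_(inside, outside)].sum()

def ratio_cut(A, clusters):
    n = A.shape[0]
    total = 0.0
    for C in clusters:
        rest = [v for v in range(n) if v not in C]
        total += cut_value(A, C, rest) / len(C)
    return total

def normalized_cut(A, clusters):
    n, k = A.shape[0], len(clusters)
    degrees = A.sum(axis=1)
    total = 0.0
    for C in clusters:
        rest = [v for v in range(n) if v not in C]
        total += cut_value(A, C, rest) / degrees[C].sum()  # vol(C) = sum of degrees
    return total / k

# Two triangles joined by a single edge; the natural split has a small cut
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[u, v] = A[v, u] = 1
clusters = [[0, 1, 2], [3, 4, 5]]
print(ratio_cut(A, clusters), normalized_cut(A, clusters))  # 2/3 and 1/7
```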
$$\min_{S \in R^{n \times k}} Tr(S^T L S) \quad s.t.\ S^T S = I$$
$$L = D - A, \qquad D = \begin{bmatrix} d_1 & 0 & \cdots & 0 \\ 0 & d_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & d_n \end{bmatrix}, \qquad L_{normalized} = I - D^{-\frac{1}{2}} A D^{-\frac{1}{2}}$$
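A minimal numpy sketch of the relaxed problem for $k = 2$: the eigenvector of L with the second-smallest eigenvalue (the Fiedler vector) gives the relaxed indicator, and rounding by sign yields the two clusters. The toy graph here is a hypothetical example, not from the original notes:

```python
import numpy as np

def spectral_bipartition(A):
    """Relaxed ratio-cut for k = 2: split by the sign of the Fiedler vector."""
    D = np.diag(A.sum(axis=1))
    L = D - A                                  # unnormalized Laplacian
    eigvals, eigvecs = np.linalg.eigh(L)       # eigh: L is symmetric, ascending order
    fiedler = eigvecs[:, 1]                    # eigenvector of 2nd-smallest eigenvalue
    return (fiedler > 0).astype(int)           # cluster label per vertex

# Same two-triangle graph as above: the split recovers the two triangles
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[u, v] = A[v, u] = 1
print(spectral_bipartition(A))  # e.g. [0 0 0 1 1 1] (label names arbitrary)
```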
$$\frac{d_i d_j}{2m}$$
$$\sum_{i \in C,\ j \in C} \left(A_{ij} - \frac{d_i d_j}{2m}\right)$$
$$\max \frac{1}{2m} \sum_{C} \sum_{i \in C,\ j \in C} \left(A_{ij} - \frac{d_i d_j}{2m}\right)$$
$$Q = \frac{1}{2m} Tr(S^T B S)$$
$$B_{ij} = A_{ij} - \frac{d_i d_j}{2m}$$
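A numpy sketch of modularity in this trace form, assuming S is the n×k 0/1 community-indicator matrix and using the same hypothetical two-triangle graph:

```python
import numpy as np

def modularity(A, S):
    """Q = (1/2m) * Tr(S^T B S) with B_ij = A_ij - d_i d_j / (2m)."""
    d = A.sum(axis=1)
    two_m = d.sum()                       # 2m = sum of degrees
    B = A - np.outer(d, d) / two_m        # modularity matrix
    return np.trace(S.T @ B @ S) / two_m

# Two triangles joined by one edge, with the "natural" community assignment
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[u, v] = A[v, u] = 1
S = np.zeros((6, 2))
S[[0, 1, 2], 0] = 1    # community 0
S[[3, 4, 5], 1] = 1    # community 1
print(modularity(A, S))  # 5/14 ≈ 0.357 for this split
```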
$$\max(\min)_S\ Tr(S^T X S) \quad s.t.\ S^T S = I$$
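For a symmetric X, this relaxed trace problem is maximized (minimized) by the k eigenvectors with the largest (smallest) eigenvalues. A quick numerical check of that fact, with hypothetical sizes and a random symmetric X:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 8, 3
X = rng.normal(size=(n, n))
X = (X + X.T) / 2                          # make X symmetric

eigvals, eigvecs = np.linalg.eigh(X)       # eigenvalues in ascending order
S_max = eigvecs[:, -k:]                    # top-k eigenvectors -> maximizer
S_min = eigvecs[:, :k]                     # bottom-k eigenvectors -> minimizer

print(np.trace(S_max.T @ X @ S_max), eigvals[-k:].sum())  # equal
print(np.trace(S_min.T @ X @ S_min), eigvals[:k].sum())   # equal
```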
Data mining, also known as Knowledge Discovery in Databases (KDD), is the process of extracting useful hidden information from large databases in an unsupervised way.
-> tree -> graph
$$MIS = \frac{\text{size of the maximum independent set of the instance graph}}{\text{number of edges in the database graph}}$$
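The numerator is the size of a maximum independent set of the instance (overlap) graph, whose nodes are the pattern's embeddings and whose edges join overlapping embeddings. A brute-force sketch for tiny instance graphs; the four-embedding overlap graph below is a hypothetical example:

```python
from itertools import combinations

def max_independent_set_size(nodes, edges):
    """Brute-force size of the maximum independent set (fine for tiny instance graphs)."""
    adj = {u: set() for u in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    for size in range(len(nodes), 0, -1):          # try large sets first
        for subset in combinations(nodes, size):
            if all(v not in adj[u] for u, v in combinations(subset, 2)):
                return size
    return 0

# Instance graph: one node per embedding of the pattern, edges join overlapping embeddings
embeddings = ["e1", "e2", "e3", "e4"]
overlaps = [("e1", "e2"), ("e2", "e3")]            # e1/e2 and e2/e3 overlap
print(max_independent_set_size(embeddings, overlaps))  # 3 -> {e1, e3, e4}
```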
Summation operation: how it appears on graphs
Example of the concatenation (splicing) operation
Concatenation (splicing) operation: how it appears on graphs
$$X = \{x_1, x_2, ..., x_n\}$$
$$K(G_1, G_2) = \sum_{h_1} \sum_{h_2} p(h_1) p(h_2) K_L(l(h_1), l(h_2))$$
$$K_L(l_1, l_2) = \begin{cases} 1 & \text{if } l_1 = l_2 \\ 0 & \text{otherwise} \end{cases}$$
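A Monte Carlo sketch of this kernel: sample label sequences of random walks h from each graph and average the delta label kernel above over the sampled pairs. The dictionary-based graph encoding, the fixed walk length, and the uniform walk distribution p(h) are all assumptions of the sketch, not part of the original definition:

```python
import random

def sample_walk_labels(adj, labels, length, rng):
    """Label sequence of a uniform random walk of `length` steps."""
    v = rng.choice(list(adj))
    seq = [labels[v]]
    for _ in range(length):
        if not adj[v]:
            break
        v = rng.choice(list(adj[v]))
        seq.append(labels[v])
    return tuple(seq)

def marginalized_kernel(adj1, lab1, adj2, lab2, length=3, samples=2000, seed=0):
    rng = random.Random(seed)
    walks1 = [sample_walk_labels(adj1, lab1, length, rng) for _ in range(samples)]
    walks2 = [sample_walk_labels(adj2, lab2, length, rng) for _ in range(samples)]
    # K_L is the delta kernel from above: 1 if the label sequences match, else 0
    matches = sum(w1 == w2 for w1, w2 in zip(walks1, walks2))
    return matches / samples

# Two tiny labeled graphs: a labeled triangle and a labeled path
adj1 = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
lab1 = {0: "A", 1: "B", 2: "A"}
adj2 = {0: [1], 1: [0, 2], 2: [1]}
lab2 = {0: "A", 1: "B", 2: "A"}
print(marginalized_kernel(adj1, lab1, adj2, lab2))
```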
Decision stump
$$h_{\langle t, y \rangle}(x) = \begin{cases} y & \text{if } t \subseteq x \\ -y & \text{otherwise} \end{cases}$$
$$gain(\langle t, y \rangle) = \sum_{i=1}^{n} y_i \, h_{\langle t, y \rangle}(x_i)$$
$$\sigma(v, w) = \frac{|\Gamma(v) \cap \Gamma(w)|}{\sqrt{|\Gamma(v)||\Gamma(w)|}}$$
$$CR(g'_k(c_i)) = SCF(g'_k(c_i)) * ISF(g'_k(c_i))$$