When running pandaPy on the toy data via the command below (paths are relative to the root directory of netZooR), I get a runtime warning:
RuntimeWarning: Precision loss occurred in moment calculation due to catastrophic cancellation. This occurs when the data are nearly identical. Results may be unreliable.
This may be affecting the stability of our results. I also see this warning with other datasets.
> x = netZooR::pandaPy(expr_file = "inst/extdata/toy_expr.txt", motif_file = "inst/extdata/toy_motif_prior.txt", ppi_file = "inst/extdata/toy_ppi_prior.txt", modeProcess ="intersection")
Loading motif data ...
Elapsed time: 0.00 sec.
Loading expression data ...
Elapsed time: 0.01 sec.
Loading PPI data ...
Number of PPIs: 26
Elapsed time: 0.00 sec.
Calculating coexpression network ...
Elapsed time: 0.00 sec.
Creating motif network ...
Elapsed time: 0.00 sec.
Creating PPI network ...
Elapsed time: 0.00 sec.
intersection inst/extdata/toy_motif_prior.txt inst/extdata/toy_expr.txt inst/extdata/toy_ppi_prior.txt False False False
Normalizing networks ...
/Library/Frameworks/R.framework/Versions/4.4-arm64/Resources/library/netZooR/extdata/panda.py:283: RuntimeWarning: Precision loss occurred in moment calculation due to catastrophic cancellation. This occurs when the data are nearly identical. Results may be unreliable.
norm_row = zscore(x, axis=1)
Elapsed time: 0.00 sec.
Saving expression matrix and normalized networks ...
Elapsed time: 0.00 sec.
Running PANDA algorithm ...
step: 0, hamming: 0.2752296820144946
step: 1, hamming: 0.23896113082839454
step: 2, hamming: 0.2630215125187659
step: 3, hamming: 0.3145220112503586
step: 4, hamming: 0.3690434806726151
step: 5, hamming: 0.4038490777545354
step: 6, hamming: 0.4262564300040291
step: 7, hamming: 0.4308194301537684
step: 8, hamming: 0.42024105862038885
step: 9, hamming: 0.39793009499132426
step: 10, hamming: 0.3675265775182517
step: 11, hamming: 0.3324899312007666
step: 12, hamming: 0.29574121431397293
step: 13, hamming: 0.25948532911470756
step: 14, hamming: 0.22520306342978716
step: 15, hamming: 0.19375801318900013
step: 16, hamming: 0.16554861051754077
step: 17, hamming: 0.1406551244943592
step: 18, hamming: 0.11895871766641763
step: 19, hamming: 0.10022805251986688
step: 20, hamming: 0.08417781006558679
step: 21, hamming: 0.07050600107167179
step: 22, hamming: 0.058916435238549704
step: 23, hamming: 0.04913134486280213
step: 24, hamming: 0.04089768727885859
step: 25, hamming: 0.03398950957705744
step: 26, hamming: 0.028207949104165828
step: 27, hamming: 0.023379882987106776
step: 28, hamming: 0.019355868934264796
step: 29, hamming: 0.016007777769438004
step: 30, hamming: 0.013226361941612282
step: 31, hamming: 0.01091890390913413
step: 32, hamming: 0.009007024194430944
step: 33, hamming: 0.0074246882840646455
step: 34, hamming: 0.006116426157789724
step: 35, hamming: 0.005035762732785468
step: 36, hamming: 0.004143848439728981
step: 37, hamming: 0.0034082742418890485
step: 38, hamming: 0.002802053178586656
step: 39, hamming: 0.0023027499762391934
step: 40, hamming: 0.0018917407711360468
step: 41, hamming: 0.0015535860950818894
step: 42, hamming: 0.001275501702135251
step: 43, hamming: 0.0010469133750784856
step: 44, hamming: 0.0008590834244181245
Running panda took: 0.01 seconds!